The new iPad is thinner, lighter, and faster, but its biggest draw is yesterday’s features: Touch ID and a better camera, says The Guardian. There are also reviews at The Verge and Engadget.
Yesterday’s features are only now being fully realized.
Contrary to some people’s opinion, Touch ID was not solely a security mechanism to unlock your device. (Apple Pay)
Apple sold 12.3 million iPads last quarter. 12.3 million units in 91 days. Right before new iPads that everybody knew were coming. Is that not “enough?”
Edited 2014-10-23 04:14 UTC
Well, the Air 2 has a tri-core CPU (80% faster in benchmarks) and a new GPU with more than twice the power of the A7 in the Air 1. The Air 2 also has twice the RAM and a better camera. And on top of that, it’s thinner and lighter than the outgoing model, with the same battery life, a better-looking display, and Touch ID.
Neither phones nor tablets have been exciting for years. I’m not sure why anyone expected this year to be any different. I guess people were expecting the thing to jerk them off or something.
The ultimate killer feature IMHO.
It’d probably kill the market for used models, though, unless you’re into that sort of thing.
Edited 2014-10-23 06:19 UTC
Feeling that the average Joe is getting dumber and dumber with time, even though technology and information access are now unlimited, I wonder how much they actually use the devices’ capabilities to justify needing a harder, better, faster, stronger model. Have they already reached 100% of the limits of the previous models, or is this just merchandising at its worst? Or just a dick game?
Kochise
Eh, what difference does it make? I’m sure some people just like having the latest and greatest. Far be it from me to judge how others spend their own money.
Not if you made it fully waterproof. A quick wash and no one would be any the wiser.
The problem is that tablets are really starting to look like a fad, and NOT in the netbook sense, where the OEMs ended up pricing them out of existence thanks to MSFT, but an eBook-reader kind of fad.
A LOT of my customers jumped on the bandwagon, and when I ask them what happened to it? It’s gathering dust somewhere or got dumped on Craigslist for cheap; they just couldn’t find a valid use case for the thing. The cellphone with its ever-growing screen is taking the “portable in my pocket” role, and the laptop is so much more useful, so where does that leave the tablet? In the sock drawer, apparently.
That’s what I’ve noticed. When the iPad first arrived they were everywhere. Now tablets seem to be used mostly for entertaining young children at restaurants.
It could also be interesting to read news not related to Apple or Google…
As good as the new iPad is, our business doesn’t need to upgrade from older models. The older iPads with the latest updates are sufficient for our needs.
The upgrade cycle seems to be more like that of office PCs rather than mobile phones. That is what Apple needs to worry about.
Agreed. The first iPad was good enough for most everyone. The iPad pretty much fits all the needs that were outlined by Michael Arrington for the CrunchPad. Not much of a need for anything else.
Just wait until phones don’t really need to be upgraded any more (we are getting there), just like PCs and tablets (which occupy the same space; tablets do things that laptops used to do). Then the whining will really start.
At least in the UK, we have become accustomed to phone costs being subsumed into our contracts. The contract ends after 18/24 months, and you may as well get a new phone rather than a SIM-only deal, as “it only costs an extra few quid a month.”
Horace Dediu, the analyst at Asymco, has an interesting take on the new iPad.
http://www.asymco.com/2014/10/22/the-new-ipad-is-it-better/
and Ben Bajarin at Techpinions also has an interesting take on the new iPad
http://techpinions.com/the-ipad-air-2s-huge-upside/35974
Both are about the impact of the new iPad in the enterprise sector and professional uses. I will be interested to see what the first batch of new IBM enterprise iOS apps looks like; they are due very soon.
The pundits and those who write for sites like The Verge have unrealistic expectations that every new model must be totally new, totally redesigned, with completely new functions, or they will trash it and be totally bored. Small iterations or refinements of existing functions are labeled “boring” and dismissed. As I said, their expectations are unrealistic and unsustainable for the manufacturers, so I don’t pay any attention to those “boring” reviews.
Just as Apple makes big updates to the iPhone every second year and the year after releases a very similar but better-specced model, they do the same with the iPad. No surprise here. This year’s iPad is the iPad Air S. Significant update to the iPhone this year = significant update to the iPad next year. I think it’s a good idea that brings some consistency and stability. If they were to randomly update their devices all the time, like Samsung, people would get confused. And it’s not like they don’t earn a shit ton of money. I myself own an iPad 2. Not a very fun experience with iOS 8, or 7 for that matter (compared to release-software performance), but anyway, the MBA is so much more flexible and powerful. I’ll probably buy Android if I buy another tablet some time. My Nexus 5 is great.
Edited 2014-10-23 11:15 UTC
“The Air 2 is the best tablet on the market, but is that it?”
Funny stuff.
The Air 2 is more significant than the Air in many ways, primarily with the doubling of RAM, the better camera, and Touch ID. This is the iPad to buy if you want it to last you a long time.
Before the Air 2, the iPad to buy for longevity was the iPad 2. The 1 was limited; the 2 was just about perfect and is only now reaching end of life (still usable, but not the fastest). The 3 was impressive for its display, but even with iOS 5/6 it had some trouble pushing all those pixels. The iPad 4 was a minor spec bump on that. The iPad Air was fast, but its 1GB of RAM will restrict it in the future. The iPad Air 2 will last for a long, long time.
Doubling the RAM is not enough. I do not understand why Apple “hates” to put plenty of RAM into its iDevices.
More RAM means more battery consumption…
Kochise
Kochise,
Sometimes I wonder why individual RAM banks cannot be powered off. That way, when the system is idling and/or doesn’t need all the RAM, it could simply shut off the unneeded DRAM banks and conserve power. For all I know some special hardware may already support this, but I failed to find any information about it. It seems like it would be a simple power-saving measure on desktops and mobiles alike.
To work optimally it would need help from the operating system; however, I think most of the work is already done by the swap/paging systems. Instead of swapping for the purpose of allocating new RAM, it would be swapping for the purpose of shutting down a bank of RAM.
This would add a bit of extra delay, since unpowered RAM needs to be loaded again (presumably at SSD speeds), but it’s still no worse than a system that didn’t have that RAM to begin with. This way, you could add lots of RAM without battery life being the limiting factor. The vast majority of the time, when the device is sitting idle, most of the RAM could remain unpowered, saving even more energy than on devices having less RAM.
Edited 2014-10-23 16:18 UTC
Unlikely. Since Apple is switching to 64 bits, that means handling more RAM, no? How is the system supposed to know when to enable the extra RAM banks? Better to leave them online all the time. BTW, the power consumption is the same between SSDs and classic hard drives.
Kochise
I think implementing a memory-allocation algorithm that is aware of memory banks is too complicated. Because of the MMU present in modern CPUs, every process accesses its own virtual address space, which does not map to contiguous physical memory at all (that non-contiguity is what makes disk swapping possible).
ebasconp,
Haha, I was thinking the exact opposite: the MMU makes the problem much easier. Thanks to the MMU, processes don’t care where they reside in physical RAM. This means the OS is at liberty to move them around in physical RAM or even to disk. To the extent that operating systems _already_ work with swap, this is very similar.
Assume we’ve got 8 banks of RAM, 512MB apiece, 4GB total. Now say the active working set is ~600MB, but assume the page allocations are scattered throughout all 8 banks. The OS could disable them one bank at a time by finding the least-used bank and swapping it to disk OR moving its data to another bank. The end result is unused DRAM not needing to stay powered up.
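The consolidation step described here can be sketched in a few lines. This is purely illustrative — it assumes hypothetical hardware where whole banks can be powered off, and it only migrates between banks (a real OS could also swap the remainder to disk):

```python
# Sketch of OS-level DRAM bank consolidation. The MMU means a page's
# physical location is invisible to processes, so the OS is free to
# migrate pages between banks. All names and numbers are illustrative.

BANK_MB = 512  # 8 banks x 512 MB = 4 GB total


def consolidate(used_mb, bank_mb=BANK_MB):
    """used_mb: MB of live data per bank. Migrate data out of the
    least-used banks into free space on fuller banks, and return the
    new layout plus the list of banks that can now be powered off."""
    banks = list(used_mb)
    off = set()
    while True:
        live = [i for i, mb in enumerate(banks) if mb > 0 and i not in off]
        if len(live) <= 1:
            break
        src = min(live, key=lambda i: banks[i])        # least-used bank
        others = [i for i in live if i != src]
        if banks[src] > sum(bank_mb - banks[i] for i in others):
            break  # doesn't fit elsewhere (a real OS could swap to disk)
        # Move src's pages into the fullest banks first.
        for dst in sorted(others, key=lambda i: banks[i], reverse=True):
            moved = min(banks[src], bank_mb - banks[dst])
            banks[dst] += moved
            banks[src] -= moved
            if banks[src] == 0:
                break
        off.add(src)
    off |= {i for i, mb in enumerate(banks) if mb == 0}
    return banks, sorted(off)


# The scenario above: a ~600 MB working set scattered across all 8 banks.
layout, powered_off = consolidate([75] * 8)
```

In this toy run the 600MB working set ends up packed into two banks, leaving the other six candidates for power-down; the delay cost only appears later, when a powered-off bank’s contents are needed again.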
Kochise,
Yes, but I’m not sure what that has to do with this idea. It could still apply to 32-bit or 64-bit.
For starters, if the system is only using a small working set (let’s say 512MB), then there’s *obviously* no need to refresh the DRAM cells which aren’t in use. They only need to be refreshed when they’re holding real data. In theory RAM could have some kind of “trim” command (using SSD terminology) to tell the RAM which banks can be turned off/discarded. Alternatively it could be supported on a main board by turning entire DIMMs on/off.
Now the more complicated scenario is when the RAM really is in use, but it’s being used by idle processes that are “sleeping”. In this case, these would obviously have to be swapped out before the DIMM can be turned off. The mechanics would be practically the same as swapping out memory when a process needs more RAM than is available.
If the size of the working data set doesn’t need all the RAM, then it makes sense to turn it off, assuming hardware made that possible. I would think a fairly simple algorithm could be effective here. Like swapping, it would be completely transparent to applications; new RAM would come online when needed, and shut off when not needed.
Where did you hear that? My SSDs use about 1/3 the power my regular HDDs do.
The increase is negligible considering the other components sucking orders of magnitude more juice (*cough* the display *cough*).
In this case the main concern as far as Apple is concerned is that doubling RAM cuts down profit margins.
Nope, they could sell the “RAM enhanced” model at a premium price. See the 16 vs 32 GB smartphones, where the added memory costs $100 when a 16 GB USB 3 mass-storage device costs $20. Profit margin? Naaaah, they owe you already…
Kochise
Not really. That approach leads to more SKUs, which adds further complexity to their supply chain. It also adds confusion to their marketing message. That ultimately leads to the potential of reduced margins.
So instead of having to mess with your well-streamlined supply chain, which is hard work, all Apple has to do is make up some techno-sounding talking points about how having half the RAM of your competitors is not an issue. Done.
Well they certainly are doing more with less. Apple’s processors are outperforming the Android competition at half the frequency.
iOS devices use far less RAM for basic operation than Android and Windows mobile devices. It’s one of many areas where iOS is more efficient and stable than the competition.
Of course, no one knows how Apple manages that, since they don’t have any real programmers, just marketers and UI designers, right?
Maybe iOS uses fewer resources because it has fewer features?
tylerdurden,
Yes, clearly that’s a big one, but a difference is that the display only needs to consume energy when the user is there using it. The RAM will consume energy as long as the device is on.
Hypothetically speaking, a user who uses the tablet 3 hours a day might still consume most of the energy while the device is on and idling during the other 21 hours of the day. Of course I’d be interested in seeing an actual energy-consumption breakdown in detail.
CPUs have gotten much better at throttling/turning off when idle, so I wouldn’t expect them to be a big culprit for idle energy use. DRAM, on the other hand, constantly needs refreshing to keep its integrity, so it’s always using energy. I think there are things we could do to solve this, like I mention in my earlier post (someone tell me why that was downvoted?).
Out of curiosity, what is your actual level of understanding of the current state of the art in RAM technologies?
I’m curious about your question. Which of the following applies?
1. You’d really like to know his level of expertise.
2. You’re making a hidden ad hominem attack by implying he doesn’t know anything about RAM instead of countering his claim directly.
3. You’re hoping he will reply with something you can show is wrong, which will imply he is wrong about other things without you actually having to prove anything.
I really doubt option 1, myself.
I’m afraid your post has more to do with your own projection than with my simple question…
Edited 2014-10-24 02:54 UTC
tylerdurden,
Actually I also wondered if you were insinuating something by it, haha
I’m still looking for answers myself. I’m having a very difficult time finding tablet power consumption broken down by component! For instance, when the tablet is sitting there unused, how much power is being used by idling components: wifi, RAM, CPU, etc.? I found this, but it’s for an obsolete Windows tablet, where 80% of the idle consumption goes to the “others” category.
http://www.tabletpcreview.com/news/tablet-pc-battery-life-how-wirel…
We need a similar breakdown for a modern ARM tablet.
Anyways, assuming Kochise was right that the reason for not adding RAM is battery consumption (I cannot verify it one way or the other), then being able to turn it on and off seemed an obvious solution to me. It gives us extra memory when needed, and saves battery when not needed. I wish people would just come out and say what it is they disagree with.
We all wonder things. I wonder why some of you feel so threatened by a simple question, which still remains unanswered BTW, haha
I’ll assume the answer is that you don’t know much about RAM technologies in general. This is not an indictment of your character, nor an insult; it’s just a way to gauge where the debate stands. It is telling that a few people, given the voting pattern, found that question threatening somehow.
So let me clear up some of the mystery: the power-consumption difference between the 2GB and 1GB configs is negligible. There are already a shit ton of power-saving design/architecture features in place, both at the silicon level and at the software level (yes, some systems allow for turning off or clock-gating unused banks). Also, many of the 1GB chipsets are actually 2GB/4GB chipsets. So the silicon is there, the leakage is there; it’s just that some functionality/capacity is disabled after binning/defect testing.
Edited 2014-10-24 16:20 UTC
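To put “negligible” in perspective, here is a back-of-the-envelope sketch. Every constant is an assumed round number for illustration, not a measured or datasheet value:

```python
# Rough daily energy budget: one extra GB of mobile DRAM in self-refresh
# versus the display alone. All constants are assumed illustrative
# figures, not datasheet or measured values.

DRAM_SELF_REFRESH_MW_PER_GB = 5   # assumed self-refresh power per GB
DISPLAY_ACTIVE_MW = 2000          # assumed power of a lit tablet display
HOURS_SCREEN_ON = 3               # assumed active use per day

extra_gb_mwh = DRAM_SELF_REFRESH_MW_PER_GB * 24       # refreshed 24 h/day
display_mwh = DISPLAY_ACTIVE_MW * HOURS_SCREEN_ON

# The extra gigabyte's share of this (incomplete) daily budget.
share = extra_gb_mwh / (extra_gb_mwh + display_mwh)
```

Even refreshing around the clock, the extra gigabyte stays in the low single digits of percent next to the display alone, before even counting the CPU, GPU, and radios.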
tylerdurden,
I’m neither a noob nor an authority on the matter of DRAM, just somewhere in between. Fair enough? Mind you, I don’t owe you any answer here.
Fine, moving on…
That’s interesting. Do you have any sources that can provide specifics regarding an OS being able to power down memory banks to save power? This is what I’ve been unable to find.
Do you have any evidence that 2/4GB variants would be shipped in 1GB tablets, such that it would be relevant in this case?
Sure, here’s an old paper on a similar idea:
http://csl.cse.psu.edu/publications/dac02.pdf
Here are some more modern techniques, although not handled at the OS level:
http://www.cse.psu.edu/~xydong/files/proceedings/ISLPED2012/docs/p3…
There’s a shitload of material published on low-power RAM design techniques/issues/etc.
I don’t think you understood what I was referring to, since I said this was done at the binning/packaging level. Those 1GB parts are not functional 2GB/4GB chipsets. It’s cheaper to produce a single high-density parts run and simply bin down, rather than run different fab lines with individual designs for each density target.
tylerdurden,
Thanks for those links; it looks like the OS-based power-reduction ideas we are discussing have already been tested in academia. I’ll need more time to read the papers in detail. Great find!
I did understand that already; as you are probably aware, they do the same when certifying the speed of CPUs. You were pointing out that the power-consumption difference between the 1GB and 2GB configs is negligible, owing to the claim that they are already physically the same. If that’s true, it would mean manufacturers could spec more RAM in tablets without incurring proportionally higher battery usage. This is good, but I was inquiring whether you had any evidence that this is the case for tablets, or if it is just speculation?
The reason I ask is that if I were a tablet engineer and discovered that the 2GB->1GB “binned” parts used more energy than 1GB native parts, I might very well spec the 1GB native parts to get better battery life for the tablet. Theoretically, many energy-saving decisions like this could save on costs (and weight/aesthetics) by eliminating the need for a bigger battery.
It didn’t sing praises about iPad Air 2? (for the record, I upvoted it)
Improving the processor cuts down on profit margins. Improving the camera cuts down on profit margins. Improving the display cuts down on profit margins. And yet they do that. It’s almost like there is more to the equation than profit margins.
No, there is no more to the equation than profit margins; that is the whole point of existence of any mega corporation, be it Apple, Google, MS, or whatever. The product is not the end goal; them getting your hard-earned money in exchange for the product is. That you think otherwise is just a reflection that you bought (literally) into the marketing manipulation; that is, you don’t see the corporation for what it is, but for what it wants you to see.
The only reason specs are updated is that they have to keep up with their competitors, and only if they really have to. In this case, Apple will sell you devices with half the RAM for twice the cost for as long as they can get away with it. Once they figure out how to optimize their supply chain to get more RAM while keeping their profit margins intact (or increased), then and only then will they go ahead and add more RAM. And in fact they will go out of their way to let you know how they’re “revolutionizing” the market with all that RAM in their devices, even if the week before they were trying to convince you that having half the RAM is actually a better thing.
This is not just an Apple thing, but they’re clearly the masters of it.
Bear in mind that iOS has been designed to function on way less memory than your typical computer. If the system is running low on memory, your app gets notified so that you can attempt to release some (e.g., by clearing your caches and releasing resources that you can recreate at a later point). If the memory pressure doesn’t eventually alleviate, your app simply gets killed. Cue one-star App Store reviews.
Also, if your app goes into the background, the OS may suspend it to save CPU cycles / battery life. A suspended app may also be terminated at any time if the system needs to reclaim memory.
As a result, the average app is forced to be a pretty good memory usage citizen. App developers are forced to think about saving the state of the app (data + UI) when it gets suspended (you are told about impending suspension so you can handle it). When the app comes back to life, you can then restore the UI. As far as the user is concerned, the app never did get killed. It should all happen instantly.
All this means that most of the system RAM is available to the foreground app while the rest are suspended or terminated. And not many apps need a full 2 GB of RAM to function. That’s the reason why iOS devices have been able to ship with so little RAM for so long.
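The policy described in these comments — warn background apps, then terminate the oldest ones if pressure persists — can be captured in a toy model. This is pure illustration, not real iOS code; all class names, method names, and figures are invented:

```python
# Toy model of an iOS-style memory-pressure policy: background apps are
# first asked to drop recreatable caches; if that isn't enough, the
# oldest suspended apps are terminated. Not a real iOS API.

class App:
    def __init__(self, name, base_mb, cache_mb):
        self.name = name
        self.base_mb = base_mb    # footprint needed to stay resident
        self.cache_mb = cache_mb  # recreatable data, droppable on warning
        self.alive = True

    def usage(self):
        return self.base_mb + self.cache_mb if self.alive else 0

    def memory_warning(self):
        self.cache_mb = 0         # app releases caches it can rebuild later


class Device:
    def __init__(self, total_mb):
        self.total_mb = total_mb
        self.apps = []            # oldest (longest-suspended) first

    def used(self):
        return sum(a.usage() for a in self.apps)

    def launch(self, app):
        self.apps.append(app)
        # Step 1: send memory warnings to background apps.
        for bg in self.apps[:-1]:
            if self.used() <= self.total_mb:
                break
            bg.memory_warning()
        # Step 2: kill the oldest suspended apps until the new app fits.
        for bg in self.apps[:-1]:
            if self.used() <= self.total_mb:
                break
            bg.alive = False


# A 1 GB device: the foreground app gets the RAM it needs; background
# apps first shrink, then get terminated.
device = Device(total_mb=1024)
mail, maps, game = App("mail", 100, 200), App("maps", 300, 300), App("game", 700, 100)
device.launch(mail)
device.launch(maps)
device.launch(game)
```

Because well-behaved apps save their state when suspended, the user never notices which path a background app took — it restores its UI on return either way, which is what lets the foreground app claim most of the physical RAM.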
iOS devices are basically media players. Android devices are basically small laptops. That largely explains the difference in hardware requirements.
You’re funny.
Effectively as fast as laptop computers from a few years ago — without a fan.
Software is the reason Apple hasn’t released an ARM-based desktop yet. Software always lags behind hardware, but you can bet that Apple has an ARM iMac in testing, waiting for the right solution to software compatibility.
I have a Mac in order to have a machine where I can run OS X, Linux, and Windows on the same box (through VirtualBox). An ARM-based Mac would be useless to me (and I think almost all developers using a Mac will agree with me).
But there is Linux for ARM, and there is Windows for ARM. Granted, the latter is almost useless, but the port exists and could easily be improved.
Does anyone still use a tablet for any serious work? I can’t remember the last time I saw anyone doing anything more demanding than web browsing. They seem to have been almost completely replaced by phablets and laptops.
I used to use my tablet all the time. Then I got an 8-year-old ThinkPad X61 (a true travel model) and it’s replaced my tablet. I can actually work on the ThinkPad instead of just surfing the web and watching YouTube videos.
Still? I don’t know anyone who ever did.