And it was that time of the year again – Apple held one of its product announcements. This one focussed mostly on the iPad, and while some will call it a disappointment merely because virtually everything had already been leaked, I’m still in awe over the fact that the newly announced iPad has a 2048×1536 display. My mind is blown.
Sure, we’ve known about this display for a while now, and it was pretty much a given even before the event was announced. This will have surely dampened the enthusiasm for this event for some, but not for me. We’ve been promised truly high-DPI displays for almost a decade, and now they’re finally here.
We’re looking at a 9.7″ diagonal sporting 2048×1536 pixels – that’s a ppi of 264. To power this many pixels, the device comes with a new, quad-core GPU, while the CPU still remains dual-core. The camera has also been improved, while the front-facing camera remains a paltry VGA one (lame). Despite the obviously beefier hardware, Apple promises no drop in battery life compared to the iPad 2.
For a device that’s 95% display, this insane new display is a pretty big deal. This will officially bring high-DPI displays into the mainstream, and it’s my hope we’re finally going to see these on laptops and regular displays as well. It’s time, damnit.
While the iPhone has become ‘meh’ compared to the more interesting and diverse competition, we haven’t yet hit that point when it comes to tablets. The iPad 2 was the best tablet, and now the new iPad (it has no additional identifiers) is even better. The competition is simply nowhere to be found, and sure, this display has probably been developed by LG or whatever, but the fact remains that Apple is the one making it obtainable.
At a mere €479 (NL)/$499. For once, I’m in full agreement with the Grubers and Sieglers of this world: there is no tablet market. There’s only an iPad market. I’m selling my iPad 2. I want this fancy display.
Other products announced include a new Apple TV capable of 1080p, and iPhoto for the iPad.
Did you notice they haven’t attached a number to this one? Me thinks the furor over the 4S/5 had something to do with it.
It is just the same as with the MacBook, iMac, Mac mini, etc.
Only the iPhone still has numbers.
Anyways, I don’t get what all the fuss is about. It has a 264dpi screen! Yup dee doo! Amazing, incredible, awesome, off the charts! And probably still low on memory and some old Cortex A9 dual core.
Yeah, the resolution is nice, but the Transformer I had pretty much destroys the iPad in every other area, esp with ICS running on it. Attach the keyboard dock to it and it’s boss.
Unfortunately, when they released a port of Lemur for the iPad, that convinced me to sell my TF and switch camps, but I wasn’t happy about it. The iPad is nice and has some specific apps that Android doesn’t (esp in the area of MIDI control and softsynths), but to say there is no tablet market other than the iPad is very shallow indeed.
Yeah, specially because Apple never made a keyboard dock for iPad v1, and bluetooth keyboards don’t work with it, especially the ones built into those keyboard cases people love so much </sarcasm>
But you still did it! You twit!
Congrats, you just disproved your own assertion. If there was a tablet market you killed it when you sold your “better than an iPad” ASSus. Good show!
“Yeah, specially because Apple never made a keyboard dock for iPad v1, and bluetooth keyboards don’t work with it, especially the ones built into those keyboard cases people love so much </sarcasm> ”
Yeah, keyboards don’t really work with the iPad.
Sure, you can use them to input text, but then you have no navigation between input fields. Keyboards on the iPad really suck.
Did you also not notice a difference going from a 320×480 screen to 640×960? The legibility of text is dramatically higher with the better screen no matter how much memory and how many CPU cores you have.
Correct me if I’m wrong here, but Apple made a big deal out of the iPhone 4/4s having 326ppi and went to great lengths to tell everyone that the human eye was unable to discern more than 300 or so ppi. They coined the term “retina” based on pixel density alone, not perceived sharpness at different viewing distances.
And now they’re saying this new iPad’s “measly” 264ppi is retina display spec as well, just because you hold it further away from you?!
Yeah, I agree that it’s nice to have resolution better than my 40″ TV or 13″ laptop LCD on a 9.7″ slate, and I know viewing distance plays a big part in perceived image sharpness….. but it’s still inconsistent with Apple’s past claims about what a retina display is supposed to be.
I’m usually the 1st to find fault with Apple’s business practices, but in fairness here, everyone makes up BS terms to sell their display-focused devices – many of whom dream up far more imprecise definitions.
Whether that be “retina displays”, the high-def “standard” (how many resolutions are classed as “HD” these days?) or even LCD TVs that are mis-sold as “LED TVs” simply because they use LED backlights (rather than true OLED displays – which I’m guessing manufacturers are hoping consumers will confuse them with). It’s all just worthless jargon.
So it’s really no wonder that consumers are confused when it comes to technology, when the facts they are presented with don’t even hint at the true specs.
I knew this was the case in my head, but didn’t quite do the math enough to figure that out.
It wouldn’t be Apple if they didn’t claim something absurd that wasn’t true.
Still, this is the first iPad that I would consider buying. I hope this really does drive displays into higher pixel densities all around. I want more than 1080 on my desktop!
Edit: doesn’t Apple still sell displays? Well, their top-of-the-line display is 2560×1440 at 27 inches, compared to the iPad’s 2048×1536. So it has all of 540,672 pixels more for all of the extra size. Upgrade that already, will you, Apple?
Hopefully the next gen Cinema Displays will be 5760×3600 somewhere between 22-30″ for 9x 1920×1200. I may not like Apple anymore, but I’d buy a display like that instantly.
Maybe a nice laptop at 3840×2400 at 15-18″ for 4x 1920×1200?
The cinema displays are still expensive as heck, but hopefully a competitor would do it as well and drive the price down.
I would actually upgrade my display (only the third time in the last fifteen years) for them.
If you go above 1920×1080 they are not expensive as fuck compared to the alternatives. I gave the market an honest probing before buying the 27″ and landed on the Apple monitor because it was a reasonable purchase.
Here’s the correction you’re looking for: when using an iPhone you’re usually holding it closer to your head than an iPad. Apple estimates that most people will hold the iPhone at a distance of 10″ (25cm for normal people) and an iPad at 15″ (38cm).
I can say that I mostly agree as I usually hold my iPhone at almost 35cm and my iPad at roughly 40cm, but their point is still valid.
At 330dpi a pixel is 0.077mm wide, so the width of a pixel would be 0.0176470995° at 25cm. At 264dpi a pixel is 0.096mm, so the width of a pixel would be 0.0144747229° at 38cm. As such, the pixels seem even smaller on an iPad 3 (if you use Apple’s math). I assume they should look about the same at the distances I use. My math might be wrong, but it’s been ages since I needed trigonometry.
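For anyone who wants to check those figures, here is a minimal sketch of the same arithmetic (not from the original comment; the 330/264 ppi values and the 25 cm / 38 cm distances are simply the ones quoted above):

import math

def pixel_angle_deg(ppi, distance_cm):
    # Full angle (in degrees) subtended by one pixel at the given viewing distance.
    pixel_mm = 25.4 / ppi               # pixel pitch in millimetres
    distance_mm = distance_cm * 10.0
    return math.degrees(2 * math.atan(pixel_mm / (2 * distance_mm)))

print(round(pixel_angle_deg(330, 25), 4))   # ~0.0176 degrees (phone-style density at 25cm)
print(round(pixel_angle_deg(264, 38), 4))   # ~0.0145 degrees (new iPad at 38cm)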
My lord, you people go to SUCH GREAT LENGTHS to find fault with something! If you like the way a product looks and it works for you, buy it, if not, don’t, it doesn’t take molecular analysis to make a decision about a tablet. You can’t tell the difference between 720p and 1080p on any tablet or laptop screen anyway, good grief.
I wasn’t finding fault with the actual product. I think it’s pretty decent, for the most part, tbh.
I just wanted clarification on this “retina display” term.
I always thought it meant “pixels small enough not to be discernible at normal viewing distance”, which would not necessarily be tied to a 300 ppi limit.
720p vs 1080p makes a massive difference, especially for reading, documents & picture detail. That’s 2.25 times the pixels!
Yes, but it’s still great news. I’ve always wanted better resolution, and when the rest of the industry starts following, that’s what we’ll get. Hopefully for 22″ plus desktop monitors as well. I’ve noticed a sharp decline in the price of IPS panels lately.
FUD!
http://www.youtube.com/watch?v=ohMgjabfiUM 7:10
“It turns out that there’s a magic number right around 300 pixels per inch that when you hold something around 10 or 12 inches from your eyes is the limit of the human retina to differentiate… and at 326 pixels per inch we are comfortably over that limit.”
Steve Jobs, 14 June 2010, at WWDC
So, are you saying it’s me or Mr Jobs that’s spreading the “FUD” ?
Thanks for the vid link, btw.
No, you’re spreading FUD!
gan17 wrote:
AND THIS IS ABSOLUTELY NOT TRUE!
and please do not play stupid…
Steve Jobs said: “It turns out that there’s a magic number right around 300 pixels per inch that when you hold something around 10 or 12 inches from your eyes is the limit of the human retina to differentiate… and at 326 pixels per inch we are comfortably over that limit.” at WWDC in 2010.
Well sorry, boo-f–k*n-hoo.
It’s still non-retina if it’s below 300ppi, imho…. regardless of what you say.
I like your “I am going to close my ears and not listen to reason” attitude.
Apple never defined Retina Display to be 300+ ppi. There is a distance aspect to the calculation, and it was mentioned in the iPad announcement with a diagram and everything.
Retina is the marketing name for the high-res iPhone/iPad displays.
As you can see in the quote posted above it is 300ppi at 10″ distance.
An absolute 300ppi metric makes no sense.
Shhhhhh! There’s hatin’ going on and you don’t need to try to mess it up with facts.
What a load of pedantic nonsense. Go to an Apple Store next week, look at a new iPad, can you see any pixels? How does the display look? Report back. Let’s move on. Please.
Sorry, 10-12 inches is way too close. More typical is likely 18 inches or more. For young kids the extra rez might matter. I personally don’t notice the difference and wouldn’t pick this tablet over a 1080p or even a 1280×800 one just for resolution.
A buddy of mine in the early 2000s had one of those tiny Sony Windows machines with the insane resolution. I frankly couldn’t make out jack without squinting at it.
That means the DPI was set incorrectly for the screen.
A 12pt font should be the same size (as in, stick a ruler up to the screen to measure) no matter what the resolution of the monitor is.
Doubling the screen resolution should not halve the size of text onscreen. If it does, someone screwed up the configuration.
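As a rough illustration of that point, here is a minimal sketch of the arithmetic (assuming the usual 1 pt = 1/72 inch convention; the 96 DPI figure is just a typical desktop default, not something from this thread):

def font_px(point_size, dpi):
    # Pixels needed to render a font at its intended physical size.
    # A point is 1/72 of an inch, so the physical size stays constant
    # and only the pixel count changes with the display's DPI.
    return point_size / 72.0 * dpi

print(font_px(12, 96))    # 16.0 px on a 96 DPI desktop panel
print(font_px(12, 264))   # 44.0 px on the new iPad's 264 DPI panel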
I think this comes from the observation that you usually keep a tablet at a longer distance than a phone.
300 dpi is the golden number in the printing business, so I, too, am disappointed. The implication was that computer screens would finally be as sharp as print.
Of course, in reality I think it’s not that simple. Ink dots bleed into each other but pixels will always remain discrete.
You’re comparing a physical DPI to a relative DPI. Big difference.
Apple’s “Retina Display” is a marketing term, meaning “DPI at which you can no longer see individual pixels when viewed at a “normal” distance“. Look up the Trademark entry, the Patent, etc. Everything relating to the “Retina Display” term deals with relative DPI, based on the “normal” viewing distance of the device (which is different for an iPhone and an iPad).
While it would be nice to have a physical DPI over 300 on a computer screen (regardless of the screen size), it has 0 bearing on the “Retina Display” trademark.
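A back-of-the-envelope sketch of that distance-dependent definition (assuming the common one-arcminute rule of thumb for visual acuity; this is an illustration, not Apple’s published formula, and the 10″/15″ distances are the ones quoted earlier in the thread):

import math

def retina_ppi_threshold(distance_inches, acuity_arcmin=1.0):
    # PPI above which a single pixel subtends less than the given visual acuity.
    angle_rad = math.radians(acuity_arcmin / 60.0)
    smallest_feature_in = distance_inches * math.tan(angle_rad)
    return 1.0 / smallest_feature_in

print(round(retina_ppi_threshold(10)))   # ~344 ppi needed at a 10" phone distance
print(round(retina_ppi_threshold(15)))   # ~229 ppi needed at a 15" tablet distance

By that estimate, 264 ppi comfortably clears the roughly 229 ppi needed at 15 inches, and 326 ppi is in the same ballpark as the roughly 344 ppi needed at 10 inches – which is more or less the viewing-distance argument being made here.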
Anyone know what the ppi is for the Asus Transformer due out later this year that will be 1080p? Frankly, there is no point in having a res that large; you certainly couldn’t see things if you used it at all. I think this is mainly to make it easier to have applications scale smoothly between iPads and iPhones.
The Asus will be 224ppi – so about 15% less density than the iPad, but should still be more than enough. The real question about Asus is what kind of availability the Transformer Infinity will have and when…
I thought of getting the TF700 as an upgrade to my AC100, but Tegra 3 is weak even for 720p. At 1080p its performance will be pathetic.
You couldn’t see things if you used it at all? Or maybe text would be a lot sharper. I’ll let you guess which.
There are a lot of people with bad eyesight here on OSAlert. And they seem to think it is normal…
The one thing I was looking for was the announcement that you could use the dang thing without having to plug it into a desktop. Without this it is still crippled by requiring an external computer to sync with iTunes. This makes absolutely no sense to me now that they’re pushing iCloud and they have an iTunes app that runs directly on the iPad itself.
Can anyone tell me why in the world I or anyone else should want to connect to an external computer just to get my video podcasts to my iPad? Never mind the various mp3s and other movies I may or may not have – why can’t I use the internal iTunes to get my freaking podcasts???
Despite the addition of the rest of the iLife suite, this requirement from the 90s to sync everything makes the iPad nearly worthless, because it is made deliberately not a computer by crippling it in this manner.
–bornagainpenguin
You can sync stuff wirelessly if you want. Plus, with some apps (like AVPlayer and GoodReader), you can set up an http and/or ftp server to beam your files over, no iTunes required. Trust me, I hate iTunes, and the syncing with iOS5 is not as horrible as I thought it would be
WorknMan replied…
I don’t want to have to use special apps, I want to use the apps the manufacturer included and optimized for the device. I also don’t want to have to use any ludicrous workarounds to use the device.
It comes with iTunes. Why can’t I use the iTunes it comes with to get my podcasts? Isn’t that what the app does? Isn’t that how you’re supposed to get podcasts in the Apple ecosystem, from within iTunes?
Why do I have to use an external iTunes to download and then copy over the podcast when the device is capable of getting it and storing it all within itself. Is the iPad a computer or not?
WorknMan posted…
Actually I don’t mind iTunes so much as I mind syncing itself. To me it just seems like a 90s era technology that made a lot of sense for PDAs, made some sense for music players, made a bit of sense for phones, but makes no sense for a tablet computer. If it is a “real” computer (and having the complete iLife suite seems to imply it is) then why do I need yet another computer to make it work?
–bornagainpenguin
Sorry, I misunderstood you. If you want to listen to podcasts on your iPad, there are several podcasts apps available on the iPad that will download the podcasts straight to your device. You don’t need to use a ‘traditional’ PC for that.
Well, this is not a ‘real’ computer in the traditional sense. It does not replace a ‘traditional’ PC any more than a bicycle replaces a car (although it CAN for some people). That’s not what it is designed for.
That being said, you don’t NEED a computer to use this. The only time I’ve had to interface with my PC is for file transfers. If you want to get a file from a PC to a tablet, you’re obviously going to need a PC, unless you use Dropbox or something. As for app syncing and stuff, you can do all that in the cloud if you want.
WorknMan posted…
It’s alright–my fault entirely. I noticed when I got back to the post that I’d skipped a word and it made the rest of the post come out wonky.
WorknMan posted…
Well there are video podcasts too, so not everything is intended to be listened to. The point is there is an iTunes included with the device, why can it not do its most basic function and get my shows?
Telling me to use a workaround (yes, telling me to use other apps to duplicate the missing features qualifies as a workaround in my opinion) doesn’t help–it breaks the whole Apple experience where things are supposed to just work. Worse it leaves me with a worthless app taking up space that cannot be removed from the system!
Never mind that as with all workarounds, Apple can break the functionality of those apps willy-nilly at any time.
WorknMan posted…
Tell that to people like my father in his late fifties, who considers the iPad to be his Personal Computer. Tell that to whomever it was that wrote Tim Cook’s speech littered with references to the post-PC era. And really, in the analogy you chose, this is more akin to trying to replace a car with a motorcycle that has had the engine disabled. You get the appearance of something fast and with advantages that a car does not have–only it has been artificially prevented from being true competition to the car.
WorknMan posted…
App syncing has improved tremendously. I really like the ability to do operating system upgrades directly on the device. These are improvements I’m happy to see Apple making, I just don’t think they go far enough to liberate the iPad from the PC.
WorknMan posted…
When it comes to basic functionality from a company that prides itself on its “offering the whole widget” approach to computing? You’re damn right I do.
This is not about the large numbers of apps in the Apple App Store–this is about basic functionality and an inability to abandon 90s era paradigms that no longer make sense. (Or if I wanted to be spiteful, I’d say this is about Apple guarding their lucrative laptop market and protecting it from being eaten away by the iPad…but I’d like to think the company a little more forward thinking than that…)
–bornagainpenguin
Gotta remember that analogy for later use. It’s brilliant !
Well, if a lack of a podcatcher out of the box is a deal-breaker for you, and you refuse to use a 3rd party app, hard to argue with that, as that is your personal preference. I’m not convinced that this is ‘basic functionality’ though, since Android doesn’t come with a podcatcher built in, and neither does Windows. I doubt WebOS does either, and Linux would probably depend on the distro. I wouldn’t expect iTunes on a tablet to be on par feature-wise with the desktop counterpart, as tablet apps are usually ‘light’ versions of their desktop brethren. As an example, Chrome on Android does not have all of the features that the desktop version does. Again, these devices are not PCs, so stop treating them as such. If you’re waiting for tablets to be on par feature-wise with PCs, you’re going to be waiting awhile.
As I said before, tablets are not designed to be PC replacements, but they CAN be for some people. I know somebody who sold her laptop when she got an iPad, because the iPad did everything she needed. And what exactly is wrong with that? If you don’t NEED a PC, why should you have one?
You need to go back and listen to Tim’s speech again. He made it very clear that the ‘post PC era’ does NOT mean an end to PCs, but only that the PC is no longer the center of the universe, which, based on the number of people I see living on their smartphones and tablets, is true.
You make it out like people are saying that tablets are somehow superior, or replacements to PCs, but nothing could be further from the truth. They’re just easier to use, and more convenient to use at times. As I stated above, they CAN be replacements in certain (limited) situations, but I myself have two PCs, and I still have a tablet, because it does things well that PCs don’t. For example, I was lying in bed last night watching Youtube vids. Tablets are good for that. PCs? Not so much. If I’m going to the crapper, I’d rather bring a tablet with me instead of trying to balance a laptop on my lap
iTunes app > More > Podcasts
Is that what your looking for? Glad to be of help. Any other issues?
]{
Does that allow you to search for, download, setup automatic series downloads, and save podcasts for later playing? All without connecting to a computer running iTunes?
That’s what the OP is asking for.
Do you know when the last time I used iTunes with my iPad was? Honestly? 6 months ago, when I upgraded to iOS 5. I’ve never needed to attach it since then. It backs up to iCloud. I download all content. Music comes from iTunes Match.
And now the OS updates no longer need to be connected to iTunes either. Looks like you’re going to go well past 6 months.
Hell yeah… One down already (5.01) and will do the 5.1 in, oh, 10 minutes or so… Just have to make sure iCloud backup is up to date. Phone updated last night, and once the update downloaded, I had to go out… So it installed on a 10-minute car journey at 70mph! No iTunes-touting computer involved.
henderson101 declared…
Awesome! Now tell me how do you add your podcasts to the iPad without connecting it to another computer, using only the software that came with it from Apple?
Yeah…I thought so. It’s crippled.
–bornagainpenguin
http://lifehacker.com/5855050/the-best-podcast-manager-for-iphone
That also works for iPad. Do you want to disqualify it just because it doesn’t come from Apple? There are about 500,000 apps in the app store; if you just counted what came with the tablet, that wouldn’t be a very useful device. Hell, I don’t think there’s anything built into Windows either that will download podcasts.
He just wants to have the same functionality in the stock app (that is supposed to manage your media) as the one on the desktop.
I understand him, because it was also one of my pet peeves when I had the iPad (1). So a friend of mine introduced me to DoggCatcher on Android. Now my wife is happy with an iPad and I am happy with a Transformer 2.
HangLoose posted…
Thank you, it’s nice to see someone who gets it.
I was really hoping that the release of the iPad HD would include a fully functional iTunes client as well as the rest of the iLife suite, which would push me towards getting one. Now instead I think I will probably just get a basic cheap Android tablet as “good enough” and sit out the iPad another rotation and see if the guys at Apple will allow their baby to leave the nest in the next revision. Could come sooner than later, given the rumors of an impending release of iOS 6 by summer… Maybe by then?
–bornagainpenguin
I feel for ya, mate.
Thing is, if Apple really don’t want people playing with the iPhone’s file system, then why not create some sort of app that acts like a middle-man hub? Meaning, you activate the app, allocate a certain amount of storage space to it, say 4GB, and then plug it (or sync via bluetooth or whatever) into your non-iTunes PC. Your PC then sees your iPhone/Pad as a 4GB external storage device. Drag n drop your songs, podcasts and movies and unplug. Then the app takes over and sorts your files the “Apple Way” (whatever that is).
Should be a simple enough job for Apple to do, me thinks. We still don’t get anything close to robust file management, but at least us non-Win/Mac users who don’t have iTunes installed could still transfer our media effortlessly.
Short answer Downcast.. http://downcastapp.com/
Long answer: I only ever listened to podcasts on my phone, and on iPhone you used to just do the “more episodes” / “back to music” dance. Then I found Downcast. It absolutely blew my mind. It handles RSS feeds, updates intelligently, and will sync the playback position and podcast list between devices via iCloud. It’s universal, and it’s not expensive. If you are serious about podcasts, take a look at it, or one of the other comparable apps.
I know about PC-free updates from iOS 5 onwards, but…
Most of my music I rip from CDs and encode myself. Most of the movies on my hard drive are torrented (yes, pirated).
Is there a way to transfer my music and movies to an iPhone/iPad running iOS 5.x without iTunes or any other software? I’m a Linux & BSD user, in case that matters.
Genuine question here. Last iDevice I owned was a 3G.
I don’t mind using iTunes on the actual iPhone/Pad but cannot/will not install it on my computers.
The iPad should be the last device you get then. Apple is closed up tighter than a dolphin’s butt. Buy an Android tablet; you can get an Asus Transformer for next to nothing now. They pretty much play every codec you can throw at them. Transferring files is a breeze, as Android has a real file manager that can see the entire drive and mount any network drive. You’ll spend at least 80 bucks in the Apple Store to even match the multimedia capabilities and free software that the Asus comes with. If you have a DLNA-compatible TV you can just stream your movies directly to the TV. If you don’t, then you can buy a device for 20 bucks that will add DLNA to any TV.
As a Unix guy myself who also has questionable media files: don’t even look at an Apple, you’ll kick yourself. Android is an open source OS, and Android tablets you can hack the crap out of – go to XDA and download a kernel for overclocking the Asus to 1.6 GHz. Plus, with the keyboard attached you get 15 hours of battery, a real terminal that you can install a LAMP server on, and when the tablet is plugged into any computer it is mounted as a storage device – no crappy iTunes problem.
The list goes on and on. Apple is bad for a hacker; Android is better.
My Android tablet suggestions are:
Samsung Galaxy 8.9
Motorola Xoom 1st or 2nd both good
Asus Transformer and Transformer Prime
Asus Slider, my favorite
On the cheap, the Archos G8
Just get Plex. It makes your entire library available, without the need to synchronize or copy through a wire.
Syncing with iTunes on the desktop stopped being a requirement with iOS 5.
nefer said…
Great! Now tell me how to get iTunes on the iPad to download my podcasts without third-party tools or hacks and without needing to connect to a desktop with iTunes…
…
…
…yeah, I didn’t think so. I’m calling it as I see it–the iPad has been deliberately crippled for no good reason.
–bornagainpenguin
Get them on the iTunes store.
nefer replied…
Not trying to move the goalpost on you – honest I’m not – but while that works for one-offs, where is the ability to subscribe to a podcast, the way you can do in the desktop iTunes client? The iPad version of the iTunes app only seems to allow you to download episodes one at a time. There does not seem to be a method of subscribing within its iTunes client.
In essence, to get a taste of a show you can use the iPad version of the client, but if you want to regularly get episodes to download you need to use the desktop to sync the device. The iPad is still crippled because of this dependency on a 90s era model of interacting with a mobile device. This completely curtails its functionality as a post-PC device, because ding ding ding you STILL need a PC to make it all work!
–bornagainpenguin
EDIT – fixed silly typo
I must say I disagree. Contrary to desktops, you don’t have oodles of disk space on a post-PC device. Being able to subscribe to podcasts – then forgetting about them – and having your disk consequently filled with podcasts that you haven’t listened to yet – or have forgotten about entirely – doesn’t sound like a very attractive idea to me.
It makes much more sense not to subscribe to them in the first place, and just get them when you want to listen to them. This way you have better control over what the storage on your post-PC device gets used for.
Until you really want to listen to that new podcast … and don’t have access to wireless/cellular so can’t download it.
There has to be a balance between “download things when you want to listen” and “have it already downloaded so you can just start to listen”.
nefer suggested….
That’s the first time anyone posted a reason for this default behavior that made a lick of sense! I still think it’s wrongheaded, because I think I should be the one to make that determination – not Apple. And even if your reasoning is correct, this issue could still be worked around, much like the way DVRs handle the issue. Apple could make it a setting for the iTunes application to keep at most X episodes. Or they could make it a setting to only use XX GB to store podcasts.
But not even offering me an option to subscribe unless I’m willing to dock the iPad to a desktop, a “real” computer, is for the birds. Not everyone wants a 3G/4G account with a carrier (and the stupidity that is bandwidth caps) or will be in a place that always has WiFi. It’d be nice to be able to load up on podcasts ahead of time for those situations. Regardless, until this gets fixed it means the iPad is for all intents and purposes crippled.
–bornagainpenguin
I am utterly confused.
Where are these podcasts stored and how will they get on the device without 3G/4G or Wifi? They are stored on a Computer, right?
How does being able to subscribe to podcasts from the iPad somehow automagically circumvent downloading data over 3G/4G or Wifi?
Like the other poster said, if you willy-nilly subscribe to a bunch of podcasts on a whim on your phone or tablet a) you fill up space b) you chew up that precious 3G/4G bandwidth.
I do agree that Apple could improve the podcast feature, and they just might sometime in the future if enough users complain. I understand it is a showstopper for you, but I am completely at a loss trying to figure out what you are getting at with the above point.
Read the part you quoted.
He wants to be able to pre-load podcasts onto the iPad, using only apps that ship with the iPad (since iTunes is iTunes is iTunes, right?).
IOW, he wants to subscribe to podcasts in iTunes on the iPad. Have it automatically download new episodes while the iPad is sitting at home on wifi. And then be able to play these episodes whenever he wants while out travelling with the iPad.
No computer running iTunes involved in anyway. Everything managed and stored on the iPad.
Saying you want a Post-PC device without access to a wireless network is like saying you want a Graphical desktop without a mouse. Wireless, be it 3G, 4G or mobile, is an inherent part of the Post-PC experience.
Post-PC is NOT just about a smartphone. It’s about a whole set of technologies and services that are finely attuned to and complement each other to provide the whole widget. You either go for that widget or you don’t. You can’t be both on the bus and not on the bus. You can’t have it both ways and complain afterwards that it doesn’t work properly.
nefer posted…
Are you being deliberately obtuse?
All I meant by the mention of 3G/4G and WiFi not always being available was that streaming is not always an answer, nor is iCloud a substitute for having the media on the device itself.
Of course I want WiFi or some other kind of internet connectivity on my tablet. Not everywhere I go has WiFi though. Nor does everywhere I go have a good 3G connection let alone a 4G signal! I’m not speaking in hypotheticals here either, I know what I’m talking about.
In the area I live, AT&T is unable to provide more than EDGE connectivity! Personally, that makes it useless to do anything beyond simple text and images… kind of like being on dialup. Sure, there are other options in the area – I use Sprint myself – and several friends swear by Verizon. My Dad and Mom are sticking with AT&T despite the crappy EDGE connection in town because they think the free mobile-to-mobile minutes are worth it, and they can use their WiFi at home anyway.
I’m sure our area isn’t the only one with this kind of situation. Funny thing is, once they get about ten miles out of town suddenly they get a great signal, full 3G… Go figure.
But if they were forced to rely on that EDGE connection to get their podcasts, they’d still be downloading days later. And even then – if AT&T ever added 3G/4G support to our area – they’d still be up against the bandwidth cap. How many podcast episodes do you think it’d take before they hit 2GB? Considering the average episode size of just ONE of their shows is 199MB, and the producer releases one five times a week… yeah.
So obviously WiFi is the way to go. That’s not the issue. The issue is why should we have to connect the iPad to another computer to have their subscription work?
I’m not asking to be free from the WiFi, I’m asking why do we need to tether this “post-PC” internet tablet to the desktop? I’m asking for the chain to be taken off the device, for Apple to stop clipping its wings. The tablet is connected to the internet, let it connect that way on its own, without the need for some third party machine or application to get in the way.
nefer posted…
This straw man argument is getting tiring. No one is talking about smartphones here, the discussion is about the unnecessary crippling of the iPad while pushing it as a Post-PC device. My contention is that by forcing customers into an unnatural 90s era paradigm where they have to sync their iPads to a “real computer” to get their podcast subscriptions Apple is not providing the whole widget.
The digital hub of this decade is the iPad, not the desktop. Apple needs to let the iPad take its place, not hobble it in an effort to maintain the last decade’s digital hub. They have converged almost everything that once connected to the Macintosh into one device – the iPad – and did a great job of it, but if they don’t get out of their own way they’re going to lose everything.
–bornagainpenguin
Who’s being obtuse?
1. You are presented with good reasons why the default applications shipped with a device do not provide the functionality you are used to from the previous platform paradigm.
2. You are presented with third-party applications which provide your broken previous-paradigm use case for you.
Yet you go on and whine that it should be in the default installed applications. I don’t think it has a place there, and apparently, neither does Apple.
The Macintosh didn’t come with a built-in VT emulator either. That didn’t stop other companies from providing them. Arguing Apple should have shipped one to provide “on-par” functionality with minicomputer terminals is just beside the point. The PC was a new paradigm. New paradigms have different methods and solutions to a certain problem. Projecting your old solution onto a new paradigm and saying it’s crippled because it’s omitted in a newer system is just a daft thing to do.
Automatic subscriptions for podcasts on a post-PC device are just a plain dumb idea. You said it yourself: contrary to computers, network access varies by location. Network access is also prone to drop. There’s no telling what line you will be on at what point in time. There’s no way for the app to know that. Having podcasts subscribed in the background can lead to all kinds of undesired results: partially downloaded files, corrupted and omitted data, files loading slowly on slow networks, and clogging up the little bandwidth you have when it’s least desired, just to name a few. It just makes much more sense to download them consciously, so at least you know there’s a transfer going on, and you can initiate it when there’s time for it. Or have a central repository download them for you and stream/sync wirelessly when appropriate. Which is kinda doing the same thing twice anyway, with the benefit of having stuff centrally.
Bottom line : Podcast subscriptions on post-pc devices are a dumb idea. You’re just doing it wrong.
He’s saying he won’t always be around an Internet connection, but would still like to be able to listen to podcasts on the iPad. IOW, the podcasts would be pre-loaded while near an Internet connection, for listening later when not near an Internet connection.
Too much consumerism and hassle. I want my stuff stored and stowed in one place I can browse through at my leisure. So what if I forget about a podcast for a few months? I have a backlog. What about files that may not always be available, due to being DMCA’d or due to lacking enough mass popularity to always be available on some server somewhere?
Then there’s the problem of post-release editing. What if a newscast online gets pulled and edited and all you saw was the stream? If you don’t have the file you can’t compare the two to see if it was recut post-release.
If you hate hassle, then why are you willing to trade a small hassle for a big hassle?
Post-PC devices aren’t made to be carrying a lot of data in them. If you really want to hamster all this data the way you describe, there are pretty good systems out there which allow you to do so, and which make it available for streaming to your phone or any other device automagically, without needing to physically store it on each of the devices you carry with you.
So, you’re saying that buying a 64 GB iPad is a waste of time, since you can just stream everything over your 5 GB/month data plan? How’s about actually using that 64 GB of storage for something useful, like a couple week’s worth of podcasts to listen to later?
In 2004, I made a car trip, driving more than 2000 miles. I loaded my then state-of-the-art iPod (40GB) with songs. Since I took the trip alone, the iPod was on in the car all the time. Out of curiosity, I put the iPod on shuffle, and put it on pause so I only heard unique songs.
Guess what percentage of the songs actually played after several weeks of car trip. We’re talking roughly 10%. Buying a 64GB is not a waste of time, it’s a waste of money.
2000 miles at 50 mph is only 40 hours. At roughly 3 minutes per song, that’s only 800 songs. At roughly 3 MB per song, that’s only 2400 MB or about 2.5 GB of data.
40 hours isn’t all that much time to use as an “experiment”.
I have to cycle the songs on my phone’s 8 GB card on a weekly basis or else I get bored with the music, and that’s only listening an hour per day or less commuting.
What was your point again?
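For what it’s worth, the arithmetic in the post above does hold together under its own stated assumptions (the 50 mph average, 3-minute songs and 3 MB files are the figures quoted there, not anything measured):

miles, mph = 2000, 50
hours = miles / mph               # 40 hours of driving
songs = hours * 60 / 3            # ~800 three-minute songs
megabytes = songs * 3             # ~2400 MB at ~3 MB per song
print(hours, songs, megabytes)    # 40.0 800.0 2400.0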
That your math sucks.
Well, there’s your problem: the quality of your files is low. I only have 320kbps MP3 and FLAC files. I’m no audiophile, but I also don’t listen to crappy music where you wouldn’t be able to tell the difference either.
Same here.
I re-encode all my FLACs at a low bitrate for my portable devices. Carrying around the FLACs themselves on them is just plain silly. Even higher-bitrate MP3s are a waste of footprint on them.
Making use of the latest MP4-HE encoders, I loaded more than 3000 songs onto my iPod. Playing time: more than 10 days non-stop. Footprint: 17 GB. Who needs to carry his entire music library around? 90% of the stuff you’ll never listen to anyway.
So what? If it’s available there’s no good reason I shouldn’t be able to make full use of it.
There is a Samsung Ice Cream Sandwich tablet with a 2560×1600 resolution on its way this year.
The Android app store is also growing with top-tier apps.
And developers are flocking to it.
“The tablet market is the iPad market”? … RDF, sorry.
If it hasn’t got a super duper mega pixel awesome zillion pixels now, it’s just not a tablet player?
Source?
Growing yes, but not anywhere close to the number of good iPad apps out there.
Perhaps, but they’re flocking more to the iOS platforms.
It is, but just a crappier tablet. Apple has raised the bar again.
And now that the iPad 2 is available for $399, I think it will cannibalize some sales from the other tablets again. That seems to be the price most of them have settled at (judging from the local Best Buy)… So if you can get an iPad for that price, why bother with the others? (And yes, I have tried them and they’re pretty crappy so far. And no, I don’t own an iPad or plan to buy one. I just use them at work a lot.)
In the case of the Asus Transformer, you can get a keyboard dock that also extends the battery, and a USB port where you can plug in a mouse/gamepad/etc if you want.
You also get a USB and microSD card slot. And if you want HDMI out, you can use a $3 cable from Monoprice, instead of the $40+ dongle that Apple charges for.
As for apps, true that iOS has more, but unless your needs are very specific (like mine), the basics on Android are pretty well covered. Plus, you can run apps on it that are banned in the Apple walled garden, such as emulators. And I’m pretty sure you can run Ubuntu on it as well
And if you think Android tablets are sluggish, you should try one of these with ICS installed. Plus, it has Google services nicely integrated, and the OS itself is leaps and bounds above iOS in the various ways you can configure it.
Now, don’t get me wrong, the iPad is a great tablet, but certainly NOT the only game in town.
Why, are you looking for the Higgs Boson particle?
Still, many people even prefer e-ink.
Actually, I think that for work-oriented PCs which do not require good refresh rates and color rendering fidelity, large e-ink screens would kick ass
Android tablets are mostly fed by phone apps whose layouts are adaptable to tablets.
Upsides – phone apps look and function way better on Android tablets, and this enables transition devices like the Galaxy Note.
Downsides – the form factor’s potential is less realized than in the case of dedicated pad apps.
Nevertheless – new iPad is the first one I’d consider buying.
A Display so hi-res I don’t need it, on a tablet so hampered by iOS’ incompetence I don’t want it. I’ll go back to caring about Windows 8 and the Cyanogen Mod for my Touchpad.
Let us not forget that the Nokia N770/N800/N810 series of “internet tablets” came with 225 PPI displays, just a step below what Apple is touting today, and this was 5 years ago. Still use my N810, but to be honest, my eyes can’t make use of the high resolution. Any PPI over about 100 is a waste of time.
You know, for many a year I had wanted one of the Nokia Internet Tablets, after I had first read about them. Finally was able to afford the N900 when it came out and you know what???
I Absolutely LOVE IT!! All the bullshit that happened afterward makes me nothing short of furious at how everything I have ever loved and appreciated technology wise always gets set up to become an underdog.
I loved the Amiga, it all but died; loved the N900, Maemo was canceled then semi-brought back with the N9 (bought one of those too), etc. If I had been old enough at the time, I probably would have loved Beta over VHS. Though I did pick right with Blu-ray; it’s still arguable whether or not it’s the better technology, though I think it is, just on the basis of the massive amount they can hold.
But I’ll never like Apple products, and I never have. I have respect for Wozniak, but past that everything that has anything to do with Apple has been crap, especially the love they now have of patenting crap that others have long since been doing.
http://apple.slashdot.org/story/12/03/07/2325259/apple-wins-patent-…
To be honest, the N900 was the first OK-ish iteration of Maemo; the N<=810 were really bad. Bad UI, sluggish… So it’s a good thing that you waited for the N900.
kid: “Wow mum you got me an iPad for my birthday”
mum: “blah blah, I knew how much you wanted it”
kid: “shit!!!”
mum: “what?”
kid: “I wanted the iPad, not the iPad!!”
http://www.youtube.com/watch?v=Z9obgyYB1IU
:):):)
Give me a macbook air with that res and that price. Who cares about a tablet.
That’s what I was saying earlier! A 9.7″ screen at that resolution is not only wasteful in usage, but wasteful in battery and GPU/CPU usage.
Stick it on a tablet or a large screen LED display!
Oops, I meant stick it on a Large screen display, not a tablet.
Seriously, I would die for a computer screen that is 32″ or bigger with that resolution.
I think my 21″ CRT would do that resolution, but only at 60Hz, which made it unusable. I would love an LCD/LED that would do it. Well, change that: I want an LCD/LED that does a widescreen version. 4:3 ratio is so 1990s…
Widescreen LCD that would do it?
27″ Apple Cinema Display 2560×1440
Yeah, what the hell are you talking about? Buy a 30″ monitor that does 2560×1600 or a 27″ at 2560×1440 then – that’s all been out for years. Getting a tablet that is essentially (older) laser printer quality on a 9.7″ screen – that’s going to be fantastic for browsing and reading.
Laser printers are 1200dpi, even the very cheap ones for home use are at least 600dpi. 300dpi is the resolution of 1980s dot-matrix printers, or very early crappy inkjets.
well… NO.
The best dot-matrix printers from the 80s had up to 240 DPI resolution.
The first laser printers (LaserWriter, Atari SLM 804, LaserJet…) had 300 DPI.
It seems about 3.75 million people who buy an iPad every week do.
Though I’ll also upgrade my Air when they release a retina display for it. Awesome for coding…
Wow, the appeal to popularity fallacy sure is… well, popular with Apple fanboys lately. You should probably dress it up with some convoluted justifications though, I’m sure ilovebeer would be happy to share his with you.
MacBook Air for the same price as an iPad – not gonna happen. An Intel CPU + motherboard is a lot more expensive than the A5X, and they require a much bigger (and thus more expensive) battery as well.
That makes sense, but oh how I wish…
That once people see this screen, they will realise how crappy most laptop screens are.
Then I hope we can start to get laptops with displays with resolutions like this.
I’ve just been given a new work laptop. The vert resolution is 768 pixels.
WTF?
This is so 2000.
I’d like the numpties who made this decision (the old ones had 1050 vertical) try doing decent software dev on it using the current generation of tools.
Don’t even get me going on how much of a waste of space the MS Office ribbon is on this size of screen. What a joke.
I’d certainly pay a small premium (not the current arm and two legs) for a Laptop with a decent screen with at least 1200 vertical resolution.
When I have to change my 4:3 ThinkPad for one of those “built for entertainment” business computers and their 16:9 screens, it’s gonna be a sad day indeed…
Totally agree.
You look at the specs of some of these laptops and say good specs, good specs, then you see the resolution and say shit!
I think that is the only thing keeping price down.
I need that resolution, but having dealt with the elderly, I know they need bigger writing on the screen instead. So there is a market for both, and therefore I think they shouldn’t stick a premium on high res.
Hmm, almost exactly the same as the previous iPad but with a better display – surely they could have done better than that… At least faster connectivity than the previous one would have been good in Australia!!! I guess Apple will advertise 4G and people here will think Apple’s incompatible 4G is faster on our 3G network.
FTA:
Actually, there is a tablet market.
http://venturebeat.com/2012/03/07/ipad-tablet-market/
Apparently in ONE year, Android stole 37% of the TABLET market from Apple.
Finally, I agree with you on something
– posted from my Asus Transformer
Obviously, you don’t understand Gruber’s logic. There are two markets, one for Android tablets and one for iPads, and in the iPad market, Apple is the unchallenged leader.
I thought Samsung was challenging Apple in the iPad market. Isn’t that what those lawsuits were about?
@ Thom:
and sure, this display has probably been developed by LG or whatever
Apple did claim they developed the technology themselves. They say the same on their webpage:
In order to create a display with four times the pixels, we had to design it in a completely new way.
And for the iPhone 4:
By developing pixels a mere 78 micrometers wide, Apple engineers were able to pack four times the number of pixels into the 3.5-inch (diagonal) screen found on iPhone 4S and iPhone 4.
Apple has also filed several LCD patents, so they really seem to do their own research.
edit: http://www.patentlyapple.com/patently-apple/2010/08/apple-patents-p…
I’d really love to know why I get modded down – anything wrong with my post? Or is at least Apple lying?
Not sure if they designed these ones, but on the first two iPads they had two manufacturers, with a visible difference between the screens. If it was a single design, they would be identical. There’s no such thing as an “artistic” difference among manufacturers that could explain the difference when it comes to making a product following defined specs.
They didn’t claim to design the displays used in iPad 1 & 2. And I guess their ideas could still be combined with the different production processes of different manufacturers. But at least the display of the iPhone 4 seems to be produced by LG exclusively.
OK, “Apple engineers helped develop” or “added some ideas” would probably be closer to the truth.
The sentiment on this board is that if you don’t manufacture things yourself, you’re basically a rip off of other peoples technology. Or at least you are if your company name is Apple.
What most so-called pundits fail to realize here, though, is that Apple often makes significant investments in putting together assembly lines at other companies, which then run those assembly lines for them – with technology Apple engineered.
Does anyone know how small the pixels in an LCD beamer (projector) are?
Keep in mind, especially with a public-facing webpage, that this comes not only from a company very focused on marketing… but also from one whose marketing has not always had the best integrity.
They also had, a decade ago, whole pages devoted to the PowerPC “supercomputer on a chip” G4, based on a few hand-optimised, hand-picked edge scenarios of SIMD benchmarks (a few Photoshop filters and such) – all the while what was available on the PC side generally destroyed them, performance-wise.
Such dubious, at best, PR has been in their blood since the earliest years… for example http://arstechnica.com/old/content/2005/12/total-share.ars/3 (considering, apparently, some hardships at the time mentioned in that article, it’s even remotely conceivable that the company would have folded a long time ago without such lies… emphasis mine)
Or consider how frighteningly large a proportion of OSX users seem to believe they are immune from online threats… basically because Apple tells them so (while their OS doesn’t have a very impressive record at some annual hacking contests, and iOS has, from time to time, a “jailbreak in a browser” – really, a root exploit for any random webpage)
This here with displays might as well be largely a marketspeak for people who don’t even know anyway that, say, Apple doesn’t strictly build any of their stuff.
Whoopty-fucking-doo, so Apple crammed 3145728 pixels in a 10″ screen. Now I feel myself caring less that their product is so locked down that I can’t even have a stupid file manager and that every application needs to have their own fucking copy of the files. Oh, and also, I’ve totally forgotten that Apple constantly tries to kill all competition by claiming to have invented the rounded-corner rectangle. And it totally slipped my mind that Apple promoted this whole crappy walled garden concept, that will make computing as exciting as programming your washing machine.
And what’s this fetish with high-resolution? It’s not like there is a perceivable difference, between 1280×800 and 2048×1536 on a freaking 10″ display. You resolution snobs are really as bad as audiophiles. I’ve heard that there’s a nice gold plated, pure-silver, high-definition, super-retina clarity, dynamic filtering power cable that will triple the count of useless pixels on your iPad’s screen.
Actually, if I think about it, it’s pretty crappy that Apple’s UI isn’t truly scalable and every time they want to enlarge the resolution, they need to double it, so it doesn’t look like crap.
I didn’t notice much difference between 1280×800 and 2048×1536 either. But after I re-downloaded all my videos with my new ethernet cable (http://www.amazon.com/Denon-AKDL1-Dedicated-Link-Cable/dp/B000I1X6P…) I can almost taste the image.
There is little to no difference when consuming multimedia or playing games.
There is a hell of a difference when reading text. Just because you don’t use a tablet to read text doesn’t mean no one does. Let me tell you, when you read for hours on an LCD, sharpness counts.
I’ve read quite a few books on my Asus Transformer and, in fact, since I got it, my Kindle and my physical books have been gathering dust. I don’t feel any eyestrain and never have I stopped reading because my eyes felt tired. Have you read any books on the iPad 3, to be able to make the comparison?
Regarding the 3 million pixels – did you know that an old 72 DPI dot matrix printer is just as high resolution as a 1200 DPI laser printer output held 2 feet from you if you hold the dot matrix sheet just 33 feet away from you when you read? Makes you wonder why anyone wanted something better when what we really needed was just bigger paper, right?
I’m guessing most people haven’t had a chance to compare the iPad 2 screen to the iPad 3, but many have compared the iPhone 3gs (or earlier) screen to iPhone 4(s). Have you not?
Never let any facts get in the way of an anti-Apple rant, huh?
Billboards are printed in very rough rasters for just that reason. The viewing distance is so large that dot size doesn’t really matter all that much.
That being said, you do hold a tablet at a greater distance than a smartphone. Simple reason number one: the screen is bigger. I can often find myself staring at my smartphone screen from just a couple of inches. I don’t see this happening with a tablet. Hence too fine a pixel pitch would be overkill anyway, just like a 150 lpi raster would be on a billboard. More pixels = more processing power required to drive them. If it goes to waste anyway because of the viewing distance, why bother?
The sixties called. They want the file manager back.
Try reading large amounts of text. It’s much better as an e-reader. No need for blurry anti-aliasing anymore; things can be crisp.
It’s a choice, not a software limitation.
What does this even mean?
Been there, done that. 1280×800 is good enough … hell, even the iPad 2’s resolution is good enough …
Yeah, sure, Apple can do no wrong …
It means that after half a century, it’s time to let go of the old, crippled ways of handling things.
I’ve been waiting especially for the new iPad to get the high resolution display. I’ve been reading text on screens for decades, and do a lot of reading on my laptop screen. Even with the best anti-aliasing in a modern OS to make text look good on normal resolution displays, it’s still basically a band-aid. Text doesn’t look as crisp as it could be. It doesn’t come near the crispness of text in a book. With high resolution displays, text on these devices is actually sharper than text in print.
It’s like HD resolution. Once you get used to the sharpness, everything regular looks muddy.
I’m not throwing any sentiment around, I was just stating a fact. Vector based UI features have been found in OSX for as far back as 10.4.
Having the ability to manage your files yourself is vital for people who use the iPad as a creative tool. Try using Adobe’s Photoshop Touch or something. As you create multiple variants of the same piece, you’ll definitely miss the ability to sort them in folders and sub-folders.
Traditional file managers are inherently flawed with today’s data volumes. They’re so broken beyond repair it’s not even funny. It was fine up until we had disks that were measured in megabytes. It crossed the line of manageability when we had disks with more than a gigabyte of space. Do you know ANY system around you that doesn’t have junk hierarchical file structures of some sort, with accumulated cruft that nobody has bothered to check in ages because they don’t know what resides in them and they don’t really care to weed them out to begin with? Do you use the file manager to retrieve data from them? For both questions, the answer will almost always be no.
That still doesn’t mean squat to someone who wants to sort his files his bloody way.
You can spend millions in R&D creating a foolproof filesystem that thinks it does an optimal job, but at the end of the day people who create stuff will still want to sort their work the best way they see fit.
Exactly, one of the many things I dislike about Apple. The attitude of you’re doing it wrong if you’re not doing it the Apple way.
As I’ve said numerous times, the tool needs to adapt to the work and not adapt the work around the tool.
Nobody forces you to buy Apple products. If you want to use old paradigms on new computers, there are a range of alternatives to choose from.
I’m sure you can find a tablet to run COBOL if you would need to. Most people, however, will move on with the times and leave filesystems for what they are.
IOW, since (some) people can’t be bothered to organise their files, we should remove all capabilities to organise files? How backward is that.
I know I’ll get modded down, but what the hell…
There are historically two common interface paradigms in GUI design – document centric and application centric. In document centric systems centralized file management is not only desirable, but pretty much mandatory – you need some kind of generalized file management UI.
Almost all GUIs in common use combine both systems at the same time. iOS on the other hand is completely application centric – it was designed to work that way.
Whether you like it or not, adding file management to an application centric UI corrodes it – the whole point is to avoid it entirely. It is a trade-off. It simplifies things for the user dramatically because, if done right, file management simply becomes unnecessary.
Is iOS done right? It certainly has its flaws… But at this point, if Apple breaks down and implements a “Finder” on iOS, they may as well admit defeat. I'm not saying there aren't things that need to be fixed to improve usability, but adding file management is the last thing they should be doing. That's like adding a steering wheel to a train – once you do it you no longer have a train…
I'm just saying, it's not backwards – it's simply different. You may not think it is better, but some people at least think it can be better if done right.
Well, first of all, Apple makes things more and more simple, to attract an increasing number of people, clueless about computing, to their products. But, what is good for the clueless is really bad for people who do have a clue and will quickly feel the limits of Apple’s paradigm. What’s even worse is that other manufacturers see the piles of cash that Apple is gathering and want a piece of that. So, they start to copy Apple’s closed world. If this trend continues, I believe that in a few years there won’t be any devices that are not dumbed down appliances.
Of course, if you let people do a few very specific things, while holding their hand, you'll be able to provide them with a highly specialized tool that does those things very well. But then don't sell said specialized tool as a general computing device, because it isn't; it's just a device that can do a limited number of tasks, in a carefully selected and limited number of ways.
It’s like if you had a kid and you’d tie him with a leash to a cable that ran to only a few nearby places in town. Of course, he won’t get lost and nothing bad will happen to him, because you carefully selected the route the cable will follow, but what life would that be?
And now back to the app centric paradigm … I think it sucks, compared to the document centric one, no matter how well it's implemented. It sucks because it emphasizes the means and not the purpose. As a user, I shouldn't care about the tool I use (of course, Apple hypes users into caring); I should care that I need to get something done. In fact, that's how the real world works. Imagine all the nails tied to a certain brand of hammer, all the planks tied to a special saw, etc. It's madness.
I've not used the iDevices much, so please correct me if I'm wrong, but it seems to me that a lot of apps reinvent file management in their own little bubble and in their own different way. So you might see an app that doesn't group its documents at all, one that uses folders, one that uses tags, and so on … In a way they still do file management, but crippled and limited. The classic approach is clearly more flexible and consistent, but, sure, it gives the user the opportunity to mess things up.
Which brings us to the point that Nefer was trying to make – that the app centric approach saves people from their own messiness and, I would add, stupidity (when it comes to computing). I'd argue that helping people persist in their ignorance will not help them. Sure, it may make you a lot of money if your name is Apple. And let's not forget that not all people are stupid, unwilling to learn or clueless about technology. What they are doing is lowering the bar even for those people, and in time you might even see them disappear, because, considering the current trends, there might not be a place where they can manifest themselves.
Which brings me to the conclusion … a user-oriented software ecosystem should be about the tasks and documents, because that's what the user needs and focuses on, while Apple's ecosystem is about the apps, the tools, because that's what Apple (and soon Microsoft) focuses on – selling more of them. I'll go with the user, instead of the money-hungry corporation, any day.
I'm not clueless about computing… I have been involved with application/web development for nearly 20 years, and I am attracted to the notion of making things simpler. Just because people have a different viewpoint from yours does not make them “clueless”. I don't want to carry around a 10 inch tablet if it works just like the PC on my desk, because the PC on my desk needs a damn keyboard to be remotely usable… I have a portable PC already; it's called a laptop.
I routinely use 4 or 5 different programs that all operate on raster graphics files. Each one is different and has different capabilities – I need all of them at one time or another. Sure, I can work with these apps in a document-centric fashion on Windows – it supports it just fine, very well in fact. But I still have to tell it which app I want to run when I right-click the file, because it can’t read my mind. At that point the paradigm is no longer transparent (and in fact illustrates that it can’t be transparent, unless you always have a 1:1 relationship between application and file type).
Application centric UIs essentially concede the fact that they can’t read your mind to determine intent. They make file management transparent instead. The difference is that file management can be transparent, it serves no actual purpose in computing and can be engineered away. Applications inherently know what kind of file they work on…
Nails ARE tied to certain types of hammers… A nail gun uses a different type of nail than a standard hammer. And there are different types of hammers; it depends on what kind of nail you're hammering… Same with saws: there are different types of saw blades depending on what kind of wood you are cutting. As the old saying goes, “pick the right tool for the job”. Notice it doesn't say “pick the right job for the tool”…
I'm sorry, but your argument here is just simply wrong. Application centricity IS modeled after the real world; that is how people actually work.
Yes, that is kind of the entire point.
You see “crippled and limited”, I see “optimized for the tool”.
Its not about keeping the user from “messing things up”. Making file management application centric simply allows the application to determine the best way to interact with the types of files it operates on. Most applications abstract file management away completely, some still retain it but in targeted ways.
Yes, there is definitely a tradeoff as far as consistency goes. But it is arguably more flexible, as it can be made to work with virtually anything, even things that are not traditionally considered files at all (purpose-built web services, Dropbox, etc.).
I don’t know why you are bringing corporate motive into this – I’m strictly talking about UI design. Sure, Apple wants to sell more stuff – Microsoft too. They do that by giving people what they want, and they want things to be simpler. Your argument is basically that they shouldn’t give users simplicity because it is bad for them. All those people buying iPads seem to not agree with you…
First of all … I haven't said that anyone in this thread is clueless about computing. I said that there are such people – probably the majority.
It’s funny, I want the same power from my tablet and my phone as I want from my desktop (of course with touch friendly UIs). Why wouldn’t anyone want all that power? The form factor and the input method is where the differences are at.
Interoperability is such a beautiful thing. If I’m pissed at a tool, I can just open my files in another application, not have them hostage in the app bubble. OK, so you have to click twice, why is that such a big deal? At least you have the option to open your file in another application quickly without going through some ridiculous hoops.
No it can’t, at least not successfully. For the love of everything that’s holy, why would you want to have all your documents and files scattered around in tens of proprietary applications? Did I use X Office Suite to make that file because it is more user friendly or Y Office Suite because it has more features or Z Office Suite because I wanted to try it out? Let me open all three and see where’s my file. The alternative is to open your Documents folder, find the file immediately and use the right tool to open it. What if I want to copy it to Dropbox? Let’s say you found it in Y Office. Click share and find out that Dropbox is not an option (just happened to a friend on his iPhone). WTF do you do now? Instead, I’ll open my filemanager and copy the file to the Dropbox folder in the same consistent UI I’ve used to open it previously.
I’m talking about ordinary nails and ordinary hammers. Don’t stretch your argument just to make your point, please. I have ten nails, a blue hammer and a red hammer. Some days I’m feeling like using the red one, some days I’m feeling like using the blue one. Why would I want to limit my option to use one or the other? My final purpose is to put the nails through a plank, not bask in the delight of using a hammer. I might use the red one because it’s more balanced, or I might use the blue one because it fits in a more narrow place, but both are compatible with the nails and I don’t have to scratch myself backwards and say an incantation to the spirit of the late Steve Jobs, just to be able to use a different tool more suited for the job.
I haven’t said such a thing. You are just twisting my words. I said I want the freedom to choose the best tool for the job, not have the job tied to the tool.
And I disagree, people have a problem to fix, and pick the best tool that’s available for the job. If they decide in the middle of the job that another tool is more suited, then they drop the old tool and use the new one. The constant is the job (document/file), not the tool (app).
What, that they are inconsistent and reinvent the wheel time and time again? (only that sometimes it’s not round, it just has rounded corners)
Maybe or maybe just incomplete, as someone else said about Photoshop touch in another post in this thread.
That’s where I don’t agree. It’s not the application’s place to determine anything, it’s the user’s right to pick the right tool for the job at hand.
You need to elaborate on this one … I really can’t see your point here.
Because if you emphasize the tool, then people will buy more tools.
I love simplicity. In fact, all the things I design are as simple as possible. But simplicity isn't equivalent to fewer features, fewer options, less flexibility. It's bad to protect users from their own ignorance. This way they'll remain ignorant. But what the hell, if we can sell more iPads because of this, why not?
I don't mind power and flexibility, but the status quo in desktop UIs is power and flexibility at the expense of complexity. iOS is an attempt to create a simplified way of interfacing with a specific type of device (small form factor touch screen). Tablets that essentially just mimic existing desktop UIs have been tried, repeatedly; no one buys them…
I'm sorry but I don't understand… Application centric interfaces do NOT hold files hostage – far from it. This argument is completely irrelevant.
I have the same options either way… The difference is when I open a tool in an application centric UI it knows what files in the file system I may want to work with, I don’t have to go find them… What hoops am I going through?
I don't think you understand what an application centric UI is supposed to be… The applications don't “contain” the files – they still can (and usually do) use a centralized file system. There are still file types, and applications still register what types they support. The difference is strictly in the UI – the app presents the files it knows you need to see at the time. It doesn't “hide” them from other applications either – if it does, it's broken.
I’ll concede there are some categories of apps that do use proprietary means of storing files – just because some people do it wrong doesn’t invalidate the premise.
You're making my point for me… Hammers are tools – so are apps. What you're saying is “which tool do I want to use today?”. Do you want to simply pick the right hammer and then grab one of your nails [application centric], or do you want to go find a particular nail first and then ask it which hammer it works with [document centric]? You're even saying you have “10 nails”, as if they are all the same – that is decidedly an application centric viewpoint (i.e. “I have a hammer, so everything is a nail”).
I'm sorry, but I'm not stretching the argument – you're just wrong. Most real world scenarios ARE application centric.
Look at it like this. In any scenario that involves a tool and an object the tool works on, you need both of them. Document centric interfaces are ideal when there is one and only one tool that is “right” for the object at hand. The problem is that this is not and has never been true in computing – there are often multiple tools. It's really about the job you are going to do, not the tool – and in order to pick the right tool you need to know what you are planning to do first. The object itself doesn't know enough to abstract that decision away in either system.
Application centric UIs essentially concede that truth, and push the decision of “what job do I want to do” to the top of the decision tree – the user knows what their tools do through experience and can pick them correctly every single time…
It's not black and white though – I'm not arguing one approach is inherently better than the other. The conventional document centric file management interface does work fairly well – I'm not saying it doesn't work. I'm simply saying it isn't the only way to do it, and you don't necessarily have to lose any power or flexibility in the conversion from one paradigm to the other.
I have no idea how to respond to that… You seem to have some notion that application centric means you can only use the same tool to work on an object as the one that created it – it means nothing of the sort.
JOB <> FILE. A file is a thing. A job is an action to be performed. Tools perform actions ON things – one without the other is useless. Performing a job requires both; it is simply a matter of what order you choose them in.
My argument is that by choosing the tool that is right for the job first, the process of finding and choosing the file is simpler, because the application already inherently knows what kinds of files it works with…
Yes actually. It is the inconsistency that is really the whole point. An application should ideally present its “files” in the most optimal manner for the tool. I think that is essentially where our viewpoints collide – you think file management should be generalized so that it is consistent, I think that it should be non-existent because it isn’t actually necessary.
Ok – I won’t argue with that. Many applications are less than optimal.
So, instead of the user creating a folder to hold all the files related to a project, and then starting the apps they need and pointing the app at the files, you want to let the apps store files randomly around the filesystem and hope that the app devs were smart enough to create/use centralised storage such that the different apps can find files created/edited by other apps?
A “document-centric” setup does not mean “double-click on file and hope the right thing happens”. It means you organise your files however you want. And you start the apps yourself and load the files into the apps yourself.
An “app-centric” setup *without access to a filesystem* means you are completely at the mercy of the app developer for organising your files.
If you have a project that requires text documents, raster images, source code files, vector images, spreadsheets, audio files, etc, which is the better setup:
– everything grouped into a single folder named after the project, or
– everything strewn about randomly in some hidden filesystem where the only way to access a file is via the application that created it?
With access to a filesystem, you can group things based on the project. You can still limit yourself to only using one type of app per document type; or you can start different apps depending on your needs, and still point them all to one central project folder.
The problem with the iPad’s lack of filesystem access is how to get random files onto the damned thing in order to read/manipulate them. You have to use different apps for different filetypes and hope the app you want to use to read the file supports downloading the file somehow from somewhere.
If there was access to a filesystem, it would be easy to create a generic file transfer program to get files onto it, put them in nicely labelled locations (ie, folders) and then use multiple different apps to access those files.
We were going to use iPads in our board meetings to eliminate the use of paper agendas, minutes, handouts, etc. Except that there’s no easy way to get data onto an iPad over a wireless connection in such a way that different apps could access the data.
And now our schools that dove into the iPad fad are finding just how difficult it is to get data onto/off an iPad in any meaningful fashion over wifi, and are looking to get rid of them.
Which only works if you have 1 app per filetype, and breaks down when you want to have multiple different apps access the same file.
I think you’re a bit confused about what application- and document centric really means.
It has little to do with the file systems as such.
App centric -> Data is opened from the application. The application loads and formats the data according to its programming.
Examples: almost everything that's out and in use.
Document centric -> Data is opened from the document. The document is opened and then loads the necessary feature components in order to display the data. Examples: the big graveyard of document centric systems and their half-baked derivatives that no sane person uses anymore these days: HyperCard, OpenDoc, BeOS, OLE, …
Document centric systems proved to be the wrong way to go. While interesting for developers to have a set of reusable components to build a solution with, it proved not to be all that interesting for the end user. The idea of interchangeable components lived on in application development, though, with object-oriented programming and the idea of providing system-wide developer APIs.
Edited 2012-03-09 23:43 UTC
Close. Apps do not need to create centralized storage – they already have that. It just doesn’t work very well currently. I want the apps to query the file system for a view of all file types that it works with – this is how iOS should be designed to work in the first place. Granted, it does NOT really work this way now…
There are some technical challenges to this of course – but it is doable and would avoid having to add centralized file management (which I think adds unnecessary complexity in order to fix problems that can be addressed in other ways).
And before you say it… I'll concede this is not ideal when it comes to grouping files together by mental meta-data (i.e. a “project”). But it is a tradeoff: you lose that ability, but what you gain is eliminating what is, most of the time, a totally pointless exercise (organizing files).
And I don’t want to organize files AT ALL. It is simply unproductive busy work that accomplishes nothing if apps can find the correct files instead of you having to keep track of them.
Yes. Exactly. I simply do not have a problem with that. If the app does it badly then I’ll certainly bitch about it – but it doesn’t have to be done badly. Why the hell should I care AT ALL about how various bags of bits are arranged in the file system – I simply want the system to give me the files I need when I need them – and when I need them is when I run an app that can use them. Sure there are things like Syncing and whatnot that may require me to delve into the nitty gritty details of where things are – but that should be the exception, not the rule. I’m not arguing for abolishing file management, simply abolishing _pointless_ file management.
Computers are good at organizing data – I think it is best to let them do it instead of me.
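For what it's worth, here's a minimal sketch (hypothetical names and layout, not how iOS actually works today) of what I mean by apps querying the file system for the types they handle, instead of the user navigating folders:

import os

# Hypothetical shared document store owned by the OS, not the user.
DOCUMENT_STORE = os.path.expanduser("~/Documents")

def documents_for_app(supported_extensions):
    # Every document in the shared store matching the file types this
    # app has registered for, newest first.
    matches = []
    for root, _dirs, files in os.walk(DOCUMENT_STORE):
        for name in files:
            if os.path.splitext(name)[1].lower() in supported_extensions:
                path = os.path.join(root, name)
                matches.append((os.path.getmtime(path), path))
    return [path for _mtime, path in sorted(matches, reverse=True)]

# A raster-graphics app would register for image types and get its
# "open" list without the user ever touching a file manager:
print(documents_for_app({".jpg", ".png", ".psd"}))

The point being: the organizing becomes a query the app runs, not busy work the user does.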
First off, it's not the application that created it – it is the application that performs the job you are trying to do on the file. And yes, I absolutely think opening the application (which I will end up doing anyway) to get to the file I want to work with is a fair trade-off to make in order to eliminate any need for me to organize them in the first place.
Secondly, we are talking about a tablet… Yes, some people do something like what you are describing – in fact I do it ALL THE TIME – but I do that on a laptop. My Tablet? I want things simple, I have no need for project organization (and neither do most users of tablets).
That is a deficiency in the current implementation – not a deficiency in the concept.
…and that is exactly what the design of the OS is trying to avoid – needing all of that boilerplate…
Again, I’m not defending the state of the system at this time. iTunes sucks, so does the way syncing works (wireless or not) – I’m simply arguing that changing the nature of the interface paradigm is not the right way to fix it…
Why does everyone say that? It is simply not true at all. There is nothing stopping you from having 10 different apps that work on .jpg files or .mp4 files or whatever. My point is that an app that (for example) is designed to do batch resizing/optimization of image files is going to present the files in a very different way than one that simply allows you to edit them one at a time. In fact, in almost every batch image manipulation program I have ever seen, the centralized, consistent file manager gets thrown out the window and replaced by something custom – and that is on systems that support the document centric paradigm. Why? Because the paradigm breaks down in these kinds of scenarios.
In short: I don’t want to destroy the UI paradigm in order to fix problems that can be fixed WITHOUT destroying it…
You mention over and over that the current system is broken. You admit that iOS prevents you from easily copying files to and from other devices, and that it tends to cause app lock-in. And yet you seem perfectly complacent with this, all in the name of “the common good”. But there is no such thing as “the common good”, at least nothing that Apple is furthering. No, Apple made these design decisions on purpose, in order to lock you into their software, services and platform – not because they really want to make your life simpler. If they wanted that, hell, they could easily have prevented this clusterfuck in the first place, and still made everything just as app-centric. So my question to you is: why aren't you mad as hell at them for messing up a good idea in such a horrible way???
Um… I am. But that isn’t what my original post was in regard to – it was a response to the notion of fixing it by adding a file manager…
I am a firm believer in the notion that an application centric UI that does away with conventional file management is the right way to do it – unfortunately the only example in current use that really even tries is iOS. It’s hard to argue the idea when the current flavor in popular use is so badly fubared…
You are arguing against your own point here. Your workflow on Windows is *exactly* the kind of thing that iOS prohibits. Because I can’t just create a raster file in one app on iOS, and expect that another app that handles raster files will have any idea that that file even exists! So basically iOS encourages a kind of app lock-in where once you start using a particular app, that’s *the only* way to work on those files. It’s like it used to be with vendor-locked-in file formats, except now it’s not even possible for another app to import/export those file formats, if the original app author doesn’t want them to. Just the thing for Apple to prevent people from switching away from their platform, by getting them locked into Apple-only software (iWork, iLife, etc)…
Apple's already been going in this direction for years — iTunes and iPhoto by default insist on organizing your files for themselves, in the process making such a mess of the folder hierarchy that you are pretty much stuck with using those programs (on a Mac, of course) for the foreseeable future, and incidentally making it a huge PITA to do the simplest things like *copy a bunch of photos or an album to a USB drive*. I use these apps with the “organize for me” option turned off for just this reason. At least there is still that option. On iOS there is no equivalent.
We have taken it for granted for years now that it’s easy to just “give” people our data, without any middle man cloud service or what-have-you. It used to be called a floppy disk, then a CD, then a USB thumbdrive. But this simple concept–easily being able to copy and share hard data files with others–is being steadily eroded by iOS and its ilk. If I want to grab some data from a friend, with a laptop it used to be easy. Open up Explorer or Finder, Copy, Paste. Now with “the post PC era” there is no simple way to ask people to do such things.
No sir, I don’t like it. I want to have permanent, future-proof access to my data, not have it be locked up in some application (or cloud service) that may cease to exist 10 years down the road. I want to be able to share my files with others on a one-to-one basis without depending on some corporation to give me the privilege and keep backups of these things for me. I don’t want to be reliant on any large corporation just to get at files, files which represent my life, my thoughts, music, art, photographic memories, blood sweat and tears. Certainly not on Apple, nor on anyone else.
Edited 2012-03-10 03:00 UTC
Again – I know that. I’m not arguing that iOS does this “right”. Go back and read my posts… I’m arguing that application centric UIs do not and should not need centralized file management.
Just because iOS gets it wrong does not invalidate the premise, and simply adding a file manager does not fix the problems with iOS – its problems are self-inflicted and have nothing to do with the absence of a file management UI.
Well, we’re speaking the same language now. But the thing is, the two do go hand in hand to some degree–because file management is the common interface to the world outside the device. And the fact remains, on Android I can just plug my phone into a USB port on any computer (using an industry standard cable no less), and copy data this way and that to my heart’s content. All because it has a file hierarchy that I can see. Is that so awful? Is it so wrong to offer a file manager, even if that’s not the *primary* or *recommended* way that things are done on the system? And what might you suggest as a solution to this “backwards compatibility” problem, without offering file management capabilities? Hmm??
It has nothing to do with having a clue or not. Apple treats Smartphones and Tablets as appliances. I don’t give a crap about how my TV/DVR/Blu-ray player etc manage their internal state as long as they do what I want them to do.
Even people with a clue sometimes just want an appliance. For example, data centers are moving to using appliances and SaaS. They want some thing that works and is managed for them to reduce complexity.
Most people want to create a document, email it, print it etc. They don’t really care to know about .docx, .sxw extensions and which app those things belong to.
Apple is catering to the 95% of the population that just wants to use their devices as appliances. Their sales numbers speak for themselves. People want their tablet to be appliances.
You could just as well argue that using C makes programmers lazy… Using assembly offers you just so much more control over the hardware!
The point I'm trying to make is that traditional file systems have served their time. Computing systems these days are powerful enough to present us directly with what's interesting about files in the first place – the information enclosed within them – instead of generic-looking icons with only a file name to identify them. Why continue to take this detour, which originated when system speeds were measured in megahertz, and not just let the data speak for itself instead?
Why bother with managing folders, versions, etc., when the system is powerful enough to do it for you? Even on previous-paradigm machines, you generally don't bother with managing your iTunes library, iPhoto data, etc. When we check our mail in Gmail, we don't put things in folders like we used to in Outlook. We use search, and tags. We use metadata to categorize. Information can be in more than one “collection” at the same time, or be omitted, depending on the context.
On Post-PC devices, we don't bother with manually opening named .vcard files in an address app to be able to call a person whose details we saved. Instead, the system handles the storage of this information transparently and presents it to the user the way the application formats it, regardless of whether that's the contacts app, the phone, or a third-party application like Viber.
You are right that computing is becoming more appliance-like. Arguing that removing any notion of a file system is a bad way to reduce this complexity, and that people should learn file systems instead, is beside the point. Managing modern multi-gigabyte file systems takes effort. If the user is going to put in any effort, they're better off putting that effort towards the task they set out to do, not towards a task they need to complete before they can even start it. Technology is mainly valid if it succeeds in accelerating the task we set out to do. Managing file systems isn't one of those tasks.
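To make the “search and tags instead of folders” point above concrete, here's a toy sketch (made-up data, obviously) showing how one item can live in several collections at once – something a strict folder hierarchy can't do without copies or links:

# Tag-based organisation instead of a folder hierarchy: the same item
# can carry any number of tags, and retrieval is a query, not a path.
documents = {
    "budget-2012.xls": {"work", "finance"},
    "holiday-video.mp4": {"family", "travel"},
    "receipt-ipad.pdf": {"finance", "travel"},
}

def with_tags(*tags):
    wanted = set(tags)
    return [name for name, doc_tags in documents.items() if wanted <= doc_tags]

print(with_tags("finance"))            # ['budget-2012.xls', 'receipt-ipad.pdf']
print(with_tags("finance", "travel"))  # ['receipt-ipad.pdf']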
Nah…It is different and worse. Just like the one button mouse or the puck …
For that matter, IBM’s Job Control Language for OS/360 on mainframes was also application centric, even though it didn’t have a GUI – just punched cards. Want to delete a file? You have to run an app. That is why IEFBR14 was invented.
So I guess you can say that iOS is the JCL of the post-PC world!
but my Nokia N90 had a 259 PPI display 7 years ago..!
at 2.1 inches, 'twas small relative to an iPhone, let alone this new tablet
so I get it’s still a relatively big thing, but still, just saying
Apple are great at marketing:
http://news.cnet.com/8301-13924_3-57392624-64/new-ipad-why-quad-cor…
They don't talk about the dual core CPU (because that's where they are behind); they talk about the quad core GPU (which doesn't mean anything to anyone). A random (semi-savvy) consumer will just see that “OK, this Android tablet is quad core, but so is the new iPad”.
I think quad core graphics are much more useful than quad core cpus on tablets.
Even on desktops, quad core CPUs are seldom used to their full extent, and more often than not the extra cores are left twiddling their thumbs.
It makes much more sense to use that SoC real estate for things that DO make a difference, such as graphical performance on a display that now has four times as many pixels.
I’d say that is generally true, modern tablets use the GPU to draw everything you see on the screen, whereas the CPU usually just handles the background stuff. This is especially obvious on older tablets and e-readers that don’t have an accelerated GPU.
At my part time job, I use a quad core i5 machine and it’s definitely overkill. Though I have had to run a compiler occasionally, I’m not a programmer by trade and I just use Dev-C++ which (to my knowledge) is not optimised for quad cores. On my dual-core AMD at home, however, I get a lot of use out of both cores as I tend to do a lot of movie transcoding and game playing.
I think that alone is why Apple chose to move to a quad core GPU; it may seem like overkill but at least they have the horsepower to push all those pixels and then some. I can imagine the train wreck of trying to support such a resolution on a dual core mobile GPU, given that my Nvidia desktop board at home struggles with 1600×900 in some games.
But quad core graphics doesn’t mean anything generally.
Do I have 480 core graphics since I have gf gtx 480?
I agree that fast graphics is important, but “quad core graphics” does not mean fast graphics (while quad core cpu generally does mean fast cpu). Does anyone outside Apple talk about quad core graphics?
Quad core CPU doesn’t mean fast cpu either. It means more throughput not faster. For carrying cargo a truck has more throughput than a ferrari.
Apple markets the quad core GPU as the means to drive the massive number of pixels in the new high-res display. The new GPU was linked to the retina display announcement. 4x the number of pixels means you need the horsepower and throughput to put them on screen.
Edited 2012-03-09 15:50 UTC
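The “4x the pixels” bit is easy to check with a couple of lines of arithmetic:

# Rough sanity check: iPad 2 panel vs. the new iPad panel, in raw pixel counts.
ipad2_pixels = 1024 * 768     # 786,432
new_pixels = 2048 * 1536      # 3,145,728
print(new_pixels / ipad2_pixels)  # 4.0 – roughly four times the pixels to fill every frame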
Well, it sort of does in practice.
Slow quad core cpu doesn’t make sense, since 4 cores is the “high end”. It’s cheaper to make a dual core cpu that runs at a faster rate than quad core cpu, so if you have a slow quad core cpu, nobody’s going to use it.
Yes, again, fast GPU is important.
Quick, can you name a device with a single core GPU? Or a dual core GPU? How many GPU cores does a random ATI Radeon have? Or a Galaxy S 2?
It’s just marketing. And apparently it’s fooling people even here ;-).
No, but some people are taking it way out of context just for the sake of arguing and being anti-Apple.
Apple just says “Hey we put this really high-res display on this device, if we didn’t bump up the processing power you would get laggy performance, which would be so unlike the iPad experience. So we increased graphics performance by adding more GPU performance to our already fast chip”
From the Apple site: “The A5X chip with quad-core graphics drives four times the pixels of iPad 2 yet it delivers the same smoothness and fluidity iPad is known for.”
Every single mention of the A5x is coupled with the Retina Display feature.
It appears you have utterly misunderstood basic English, or are just trying to argue for the sake of arguing, or both.
Every single ARM-based cell phone, except those using an MP2/MP4 variant of a PowerVR GPU. The Qualcomm Adreno series are single-core. The TI OMAP CPUs use single-core GPUs. The current-gen ARM-Mali GPUs are single-core (there are multi-core variants of the T-series of Mali GPUs coming down the pipe).
You need to stop thinking in terms of the PC architecture. Everything is different when it comes to ARM.
A GPU “core” is literally an entire GPU. Thus, an SoC with a “multi-core” GPU physically has multiple complete GPUs inside.
The Apple iPhone 4S and the Apple iPad 2 (both using the A5 SoC) have dual-core GPUs, the PowerVR SGX543MP2 (the MP2 part means it has 2 complete copies of the SGX543 GPU onboard).
Most only have 1. There are a couple of super high-end cards that have 2 complete GPUs on the PCB, configured with CrossFire internally to act as a single GPU. “Stream processors”, “shaders”, etc are not “cores”. There’s only 1 GPU core in a standard Radeon GPU. Same with nVidia GPUs.
1 GPU, thus only 1 GPU core.
Edited 2012-03-12 06:18 UTC
*chokes*
You are aware that graphics are massively parallelizable, right?
You have it the other way around. Graphics are much more parallelizable than general purpose computing. Hence the reason why you’ll find a multitude of processing cores on modern graphic processors.
Edited 2012-03-09 17:19 UTC
Yes. And “4 cores” does not mean anything in this light.
My gf gtx 480 has 480 processing units. How many does ipad have?
It does. When using the same cores, a quad core GPU will offer better bang for buck than a quad core CPU because graphics processing is very easily parallelizable.
Good luck lugging your GTX 480-equipped tablet around.
GPUs are not at all like general purpose CPUs.
Let’s end this thread right here
In the realm of ARM, “quad-core” graphics literally means “4 GPUs”. Especially when it comes to the PowerVR GPUs. An SGX543 has 1 GPU “core” (ie, 1 GPU). An SGX543MP2 has 2 GPU “cores” (ie, 2 GPUs). And, an SGX543MP4 has 4 GPU “cores” (ie, 4 GPUs).
In the ARM GPU sphere, “multi-core” means “multi-GPU”. Think of it in terms of nVidia’s SLI and Ati’s CrossFire. Not in terms of processing units inside the individual GPUs.
Now we are on to something.
People that know about this PowerVR convention are not the ones being targeted by the Apple marketing speak in the launch comms. Apple is using it to steer the eye away from the fact that they only have a dual core CPU.
Nothing wrong with that, of course.
What’s wrong with “only having a dual-core CPU”? It’s not like there’s a heavy demand/need for multi-tasking on a tablet. Especially considering the lack of “true” multi-tasking in iOS. The biggest “need” in a tablet with this high of a resolution is graphics hardware. And that’s what the A5X comes with: the ability to pump pixels faster than just about any other ARM GPU out there right now (things will change with the Adreno 300-series and the ARM Mali-T-series).
Anandtech has some blog posts about demonstrations by … someone (TI?) with 4 videos playing at once, and not being able to stress the quad-core CPU inside.
Fast, power-efficient dual-core CPU with uber-fast GPU with all kinds of hardware decode capabilities is where it’s at for tablets (and, most likely, cell phones).
Will be interesting to see how ARM’s big.LITTLE architecture plays out near the end of this year/beginning of next year. High-frequency, dual-core Cortex-A15 matched with low-frequency Cortex-A7 (with some kind of next-gen GPU) should pretty much be nirvana for any mobile task.
But it still isn't much of a differentiator – graphics are inherently parallel, also internally within GPUs, so one GPU might as well be faster than a collection of four (heck, one can be as fast as, say, a thousand using the same amount of silicon; or it might very well end up faster, overheads and all).
It is, at most, an easy & digestible number to differentiate from the previous Apple SoC …so, yeah, marketing.
Edited 2012-03-14 23:54 UTC
4:3 screen and all movies are in 16:9
Black bars are good enough for everyone!
That's a really good point, and a lot of wasted screen real estate for a 10″ device.
I actually prefer the 4:3 aspect ratio.
The one and only thing that makes 16:9 desirable for a Tablet AR is video viewing, something I do occasionally but not often enough to care.
16:9 on a 7 inch screen feels ok to me (I have played with BB Playbooks for example), but on a 10 inch screen it is very uncomfortable to hold in portrait orientation – it is very top heavy. And in landscape it is too short vertically, I prefer the extra vertical real estate.
Anyway, to each his own. I get the argument, but I guess I am one of the few who like it the way it is.
You might try holding it by the side next time, that’s what wide-ish bezels of tablets are for, in a form-follows-function way… (nvm Apple would like to claim it’s their design patent or smth)
2048×1280 would make for 16:10 at 241PPI at 10″, 2560×1600 would break 300PPI at 10″ for 16:10.
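Those figures check out; the arithmetic is just the diagonal in pixels divided by the diagonal in inches (a trivial sketch):

import math

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

for w, h, d in [(2048, 1536, 9.7), (2048, 1280, 10.0), (2560, 1600, 10.0)]:
    print(f"{w}x{h} at {d} inch: {ppi(w, h, d):.1f} ppi")

That gives ~264 ppi for the new iPad, ~241 ppi for 2048×1280 at 10″, and ~302 ppi for 2560×1600 at 10″.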
64GB a waste? What a silly concept. There is no such thing as too much storage, especially in a portable platform.