AnandTech reviews the ASUS Zenbook UX305.
Overall, even with the knocks against it, this is a heck of a device for just $699. A Core M processor, which allows a fanless and therefore silent device while still offering good performance, and more performance than any other CPU that would allow for a fanless design. 8 GB of memory standard. A 256 GB solid state drive standard. A 1920x1080 IPS display, once again standard. ASUS has really raised the bar for what someone can expect in a mid-range device.
I honestly cannot believe that you can buy this much laptop for that kind of money these days – and unlike other cheap laptops, this actually isn't a piece of crap, but a proper, all-metal laptop that doesn't look like two paving slabs slapped together.
Good laptop if having a slow CPU doesn’t bother you.
A 2 GHz modern i5 is slow to you? You should compare it with other CPUs that use the same amount of power, like an NVIDIA Tegra X1 or an Apple A8X.
It’s very likely that it’ll be slower than you might expect in practice.
Its default/base frequency is only 800 MHz.
2 GHz is the maximum turbo frequency. If only one core is doing anything, it can probably run at 2 GHz for about half a second before the temperature forces it back down.
Basically, for simple things (e.g. running a word processor, where you get a tiny burst of CPU activity when you press a key but it's doing nothing almost all the time) it'd be very nice/fast, and for anything involving heavy processing it'll be relatively slow.
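If you're curious, you can watch this frequency scaling happen yourself; here's a minimal sketch that reads the Linux cpufreq sysfs files (assuming a Linux box where the driver exposes them; paths and availability vary by driver):

    import pathlib

    # cpufreq interface for the first core; other cores are cpu1, cpu2, ...
    CPUFREQ = pathlib.Path("/sys/devices/system/cpu/cpu0/cpufreq")

    def read_mhz(name):
        # Each sysfs file holds a single frequency value in kHz.
        return int((CPUFREQ / name).read_text()) // 1000

    print("hardware min:", read_mhz("cpuinfo_min_freq"), "MHz")
    print("hardware max:", read_mhz("cpuinfo_max_freq"), "MHz")
    # Watch this jump under load and drop back when the machine idles.
    print("current:", read_mhz("scaling_cur_freq"), "MHz")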
– Brendan
Okay, time to roll out the “MegaHertz Myth” again…
A 1 GHz Core M is not comparable to a 1 GHz PIII. I have an 800 MHz PIII machine, a 1.2 GHz VIA C7-M, a 1.6 GHz Atom, numerous C2Ds, a 3.4 GHz fourth-generation i7, and right now I'm using a Celeron N2840, which according to benchmarks is not far off an i3 for the basic things.
Even at 800 MHz, I'm sure it's the equivalent of a 2 GHz Core 2 Duo or similar processor, and those are still fine for web browsing and YouTube.
Hi,
If you're attempting to say an 800 MHz Haswell CPU is "just as fast" as a 2 GHz Haswell CPU, then you're an idiot. The "MegaHertz Myth" only applies when you're comparing different micro-architectures (e.g. with different "instructions per cycle"), not when you're comparing the same micro-architecture.
If you were smart you’d start questioning how much CPU power people actually need (and whether the additional power consumption of a faster CPU is justified). For things like basic web browsing and word processing even a slow Haswell is probably more than adequate. For something like Crysis? Nope.
– Brendan
Nice, calling your peer an idiot, especially for something he never did. He mentioned about five different micro-architectures, for God's sake…
Granted, since he has all these different machines, it would have been nice to see some numbers; for example the P-III against the C2D (I am writing this from a C2D laptop, whose minimum frequency is 800 MHz, so it should definitely be possible).
I have a 2 GHz VIA C7, a 1.6 GHz Intel Atom 330, a 1.6 GHz AMD E-350 and a 1.6 GHz AMD A8-3500M, and they all work really well. I am wondering how you can cope with computers below 2 GHz being considered slow.
Ten years back we made marvels with just 1 GHz, and now everyone complains because it's just enough for browsing the (static) web and word processing? And you find this "normal"?
I remember my Atari ST with 4 MB of memory and an 8 MHz CPU, able to go on the web, do publishing (Calamus SL) and all kinds of stuff, with a graphical interface built into ROM (192, then 256 KB).
Now everything is between 500 and 1000 times that, but we hardly do that much more. There's a big waste somewhere.
I guess you answered somebody else, not me. I am using a 2007 C2D at 1.8 GHz maximum, and I completely agree with you. I never had an Atari (C64 -> 386), and certainly no Internet at the time, so I cannot be sure how it compares to your use case, but all I can think of is that the advances we made (internationalization, fonts, multitasking, protected memory, OO, high-level languages, complicated document formats, etc.) somehow add exponential complexity? Complete guesswork, obviously, because some of those things should not incur runtime costs, changes to documents are often local, etc…
Of course, when looking at the case of scientific computing, one can really see the difference. So the potential is definitely there…
internationalization: Unicode code points are at most 32 bits wide, and the common UTF-8 encoding is most of the time no wider than ASCII (see the sketch below).
fonts: TTF or bitmap fonts, not much overhead when using cached rasterized glyphs.
multitasking: even the CPC or C64 were able to do multitasking, and low-grade MCUs can too, so it's not an issue.
protected memory: done by the hardware (P)MMU, completely transparent software-wise.
OO: yeah, probably the reason why things went wrong when bad coders were given C++.
high-level languages: Lisp has existed since 1960; Geneva built Lisp machines that were as fast as (if not faster than) traditional computers.
complicated document formats: if HTML is considered complicated due to its XML-like hierarchy, it's kind of sad to think we've reached the machines' limits with text documents.
etc.: sound had its dedicated sound cards, 3D games their 3D cards, networking its network cards; almost everything is offloaded from the CPU, yet most PCs sold today crawl like an underclocked 386 with little memory.
Unacceptable.
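On the internationalization point, a quick sanity check (plain Python, standard library only) shows UTF-8 really does stay at one byte per character for ASCII text and only grows for the characters that need it:

    # UTF-8 is variable-width: ASCII stays 1 byte per character,
    # and only non-ASCII characters take 2-4 bytes.
    for text in ["hello world", "naïve café", "日本語"]:
        encoded = text.encode("utf-8")
        print(f"{text!r}: {len(text)} chars -> {len(encoded)} bytes")

    # 'hello world': 11 chars -> 11 bytes
    # 'naïve café': 10 chars -> 12 bytes
    # '日本語': 3 chars -> 9 bytes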
Kochise,
Haha, I hear you! Hardware gains have been amazing, yet software devs have become complacent. Optimization is viewed as frivolous and unnecessary. There's barely any demand for optimization skills because, as long as the hardware compensates, that's considered good enough.
We have more and faster CPUs, tons of cache, RAM and disk, loads of bandwidth. These days software can be 20-50 MB+ and people don't even flinch. The increase in media is one thing, but even simple drivers and apps that don't need any media are heavyweights. They used to fit on 1.44 MB floppies.
I accept that times change and things evolve. I’m left pondering whether there will ever be a serious push back towards efficient software, or if we are on a path that never needs to go back there.
In the beginning there were plug boards and switches, and someone said: "Let's have assembly, keyboards and punch cards!".
But things were hard: "Let's have high-level languages, compilers, libraries and screens!".
Still too tough to swallow: "Let's have interpreted languages, GC, OO and virtualization!".
We are never satisfied: "Let's have JavaScript and in-browser applications, and embed it left and right!".
And the end of times ominously loomed on the horizon.
You know, empires and civilizations made wonders with these:
http://en.wikipedia.org/wiki/Arithmetic_rope
http://en.wikipedia.org/wiki/Abacus
http://en.wikipedia.org/wiki/Slide_rule
Really…
I wonder if people can still use these “outdated” technologies that were used to raise pyramids, castles and cathedrals. Or do some simple math. Or more advanced calculations.
acobar,
Haha, cute. I've never actually seen a punch card; I've done plenty of assembly though. I like high-level languages. A lot of toolkits exist to enable programmers to build more sophisticated software than they could code up themselves. They let developers do much more without a big learning curve, which is good in obvious ways. There are negatives too: obviously the bloat we're talking about, but also more subtle things. The demand for cheap programmers trained only to use automatic toolkits is eroding demand for experienced programmers with deeper backgrounds in security, efficiency, scalability, etc.
This has real-world ramifications. How many of us have been on a call where the employees on the line complain about how slow their systems are? "I'm waiting for the computer to retrieve the information." I believe this to be the norm rather than the exception. Even in this age of being spoiled with crazy fast hardware and networks, why do so many places have these problems? From my experience, the answer is usually that the companies failed to anticipate the need for optimization (network/database/software/etc.). Once a system is built and in production, it's often too late and too expensive to fix, and they end up throwing more hardware at the problem. Employees on the front lines are stuck living with a poorly engineered solution, with layers upon layers of inefficient formats, protocols and algorithms that increase the system's stress and workload a hundredfold, not to mention the lost employee productivity.
Heck, any engineer worth his salt would likely have been able to build a better-performing system on hardware from a decade ago, except for the conditions under which we're working: work faster, cut costs, rush to deliver without regard to quality or even security… It's like this everywhere I've worked and it's highly frustrating. *I* can see the importance of making software that performs well, especially when it's going to be used day in and day out, but most companies don't seem to value it like I do.
But downvoting you and me for pointing out the obvious lack of consideration for quality software is the norm.
Kochise,
Let them downvote; we're still right about modern software being inefficient. The only thing I find regrettable is that we don't know what it was they actually objected to.
Maybe we should petition Thom (and co?) to bring OSAlert into the modern age, and increase the bloat, obviously!
All comment votes, both +ve and -ve, should be accompanied by a Wikipedia-discussion-page-style justification backing up the direction of the vote!
And then, eBay-style, those long-standing (up-standing?) community "members" with enough "stars" could either allow or deny the up or down vote.
The elitist satisfaction would, I'm sure, give us all a warm glow. But we'd be "right".
And we'd be modern, bloated, fat and up-to-date.
/ssssssssssss
mistersoft,
Haha, yes you’ve mastered the art of irony!
Not sure if people would want this or not, but at least it would help explain the downvotes. If you say something that strikes a nerve, you can expect to be downvoted, even if it's the obvious truth. It makes sense, but sometimes you scratch your head wondering how someone got offended. We all have our unique perspectives, though. For the most part, OSAlert discussions go remarkably well compared to some other sites on the internet, probably because of the demographic here.
Heh. I installed the drivers for my Brother printer. It all weighed in at nearly 1000 megabytes. Talk about bloat!
WereCatf,
My god man… I haven’t seen anything that bad, but then most of my hardware is several years old now.
My 2004 Epson printer had an 8 MB setup; it pops up a dialog when it prints.
A TP-Link powerline utility here is 30 MB; all it really does is find and configure the powerline Ethernet devices.
My Logitech mouse driver is 70 MB; it probably has the worst functionality-to-footprint ratio I've seen.
Tip (not for regular users, but a power user like you should have no problem with this): don't install by running the setup. Unpack the zip/exe/cab, find the .inf and install it.
Or if you want to be slightly lazier: install once, then use dism /online /export-driver /destination:X:\MySavedDrivers, then uninstall, and from then on you can use the pure drivers from your emergency USB drive.
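For reference, a minimal sketch of that workflow from an elevated command prompt (X: and the folder names are placeholders; dism exports each driver into its own subfolder):

    :: Export all third-party drivers currently installed.
    dism /online /export-driver /destination:X:\MySavedDrivers

    :: Later, reinstall straight from the .inf, no setup.exe needed.
    pnputil -i -a X:\MySavedDrivers\<driver-folder>\*.inf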
Kochise:
The 1 GHz barrier was broken in the year 2000, so you're five years off. Thus it is 15 years back.
Kroc:
That is rubbish; there hasn't been that kind of evolution in processor IPC since the C2D, not with a 150% gain.
I repeat: 10 years back, in 2005, five years after AMD had crossed the GHz barrier with their Athlon, people made wonders with just 1 GHz. Windows 2000 is a breeze on it.
Hi,
Obviously when I said it’s “relatively slow” I meant it’s slow relatively to current generation CPUs (e.g. Haswell and Broadwell). Comparing it against old CPUs isn’t sane (e.g. why not compare it against an ancient Z80 and pretend it’s extremely fast?).
Core 2 Duo was almost 10 years ago (2006)!
Note: I don't know what the minimum frequency of the Intel Core M-5Y10 is (e.g. when it's in its most power-efficient P-state). 800 MHz is its nominal frequency, not its minimum frequency.
If you want benchmarks, maybe try: https://cpubenchmark.net/cpu.php?cpu=Intel+Core+M-5Y10+%40+0.80G…
– Brendan
So why not use a C2D? I expect laptops and CPUs to be better after 9 years.
And web browsing, normal gaming, listening to music, etc. Basically what you normally do on a computer.
Rendering, heavy simulations, compiling: stuff that should be done on a 95 W desktop CPU.
An 800 MHz CPU. Just look at the benchmarks and stop trolling: http://www.notebookcheck.net/Intel-Core-M-5Y10-SoC.125673.0.html
Compare it with a regular Core i5 mobile CPU.
You should look up those words. You are using them wrong.
Looks cool. Screen too small to be useful.
IMHO, anything below 17″ is too small. Even 17″ is barely acceptable.
^5!
I’ve had to go out and buy a new laptop recently, too.
I bought the last 17″ MBP in stock when they discontinued it, but due to the nature of my job, the thing is pretty worn out now.
I was pretty annoyed that it wasn’t easy to find a good 17″ laptop anymore. It was either budget stuff that tries to look big because big screens sell, or gaming laptops that look like advertisement lights for a “variety” show.
I ended up picking up an Acer Aspire V17 Nitro gaming laptop, because it doesn't really look like a gaming laptop. It sure ain't MBP build quality, but it's pretty fast and runs Linux well once you swap out the wireless card. It has a nice 17.3″ screen with a resolution and density that actual human beings can appreciate.
Glad to see some people actually still use laptops for work around here!
Huh, I’ve always seen 15″ laptops as annoyingly big and 14″ as a workable compromise. At 17″ you start running into too many limitations on where you can unfold it and what bags you can carry it in, IMHO. The extra weight isn’t ideal either.
Of course, I can imagine uses where you really want the extra screen size; it's just not for me.
At my previous job I had a 17″ Lenovo; it was like carrying around a surfboard. I'd rather have a desktop in the office and something ultra-portable for off-site. 17″ is too big.
I've had one of them for about a week and the Ubuntu 15.04 daily works without any problems (I was really surprised how painless the installation was). Battery life is roughly 5.5 hours of mixed wifi browsing + music, or Sublime + terminal + Haskell builds + music (brightness at about 25%, definitely enough for indoors).
I am going to see how it works with Arch.
Is everything working under Linux? I have my eye on the XPS 13, but there are still too many issues with running Linux on it, and apparently the dynamic contrast can't be switched off.
This laptop still has most of the good stuff and is slightly cheaper.
The only issue I discovered is that the keys to change brightness don't work by default. (You can probably map them manually; the brightness setting in the Ubuntu settings utility works without any extra work.)
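If you'd rather script it, here's a minimal sketch using the kernel's sysfs backlight interface (assuming the device shows up as intel_backlight, which varies by driver, and that you run it as root):

    import pathlib

    # Device name is an assumption; check /sys/class/backlight/ for yours.
    BACKLIGHT = pathlib.Path("/sys/class/backlight/intel_backlight")

    def set_brightness(percent):
        # max_brightness is the hardware maximum; brightness is writable as root.
        max_level = int((BACKLIGHT / "max_brightness").read_text())
        level = max(0, min(max_level, max_level * percent // 100))
        (BACKLIGHT / "brightness").write_text(str(level))

    set_brightness(25)  # roughly the 25% mentioned above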
Sound, screen, touchpad, wifi, close lid to sleep… all of this just works.
(The XPS 13 issues are quite a paradox: they planned a first-class Linux laptop but chose the wrong components. The XPS 13 is definitely the more powerful and more expensive alternative. I like the feel of the UX305 more, but the XPS 13 is the more logical choice once the issues are resolved and when money is no object.)
We've got a couple of 13.3″ MacBook Airs at work and wanted an equivalent Windows Ultrabook. It basically comes down to either the Dell XPS 13 or this Asus UX305, but we need Win 8.1 Pro on them (mainly so they can join the work domain), which bumps up the price somewhat for both.
The XPS 13 comes out at 729 GBP + VAT (874.80 GBP inc. tax) and the Asus UX305 at 654.12 GBP + VAT (784.94 GBP inc. tax), so the Asus wins on price in the UK; this is including shipping, with Win 8.1 Pro, remember.
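(Those inc-tax figures are just the 20% UK VAT applied: 729 × 1.20 = 874.80 GBP and 654.12 × 1.20 ≈ 784.94 GBP.)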
Those UK prices do make me wince compared to the US ones (and note the UK models skimp with a 128 GB SSD, though we mostly use networked drives, so that's not an issue). It almost makes me want to use a white-box seller because of the cheaper price and far superior customisation of their equivalent Ultrabook:
http://www.pcspecialist.co.uk/notebooks/lafite/
…FIRED!
No backlight?
Really?
This is one way to miss the mark:
“Perfect Ultrabook of the year…”
Seems like the perfect machine for most people.
Thin, light, and very good battery life.
Enough screen and resolution.
No skimping on memory or SSD either.
Good-looking, and it has enough ports (even including an Ethernet adapter in the box).
And all the while entirely affordable.
I am wondering how other websites are going to put this up against the MacBook that they all wrote so much about. This is basically the same thing, but with the ports and without the cost.
Extremely strange how the North American version is so much nicer than the version for "the rest of us".