Apple Archive
This month, more than three dozen victims allegedly terrorized by stalkers using Apple AirTags have joined a class-action lawsuit filed in a California court last December against Apple. They alleged in an amended complaint that, partly due to Apple’s negligence, AirTags have become “one of the most dangerous and frightening technologies employed by stalkers” because they can be easily, cheaply, and covertly used to determine “real-time location information to track victims.” Since the lawsuit was initially filed in 2022, plaintiffs have alleged that there has been an “explosion of reporting” showing that AirTags are frequently being used for stalking, including a spike in international AirTag stalking cases and more than 150 police reports in the US as of April 2022. More recently, there were 19 AirTag stalking cases in one US metropolitan area – Tulsa, Oklahoma – alone, the complaint said. “Consequences have been as severe as possible: multiple murders have occurred in which the murderer used an AirTag to track the victim,” the complaint alleged. One plaintiff from Indiana, LaPrecia Sanders, lost her son after his girlfriend allegedly used an AirTag to track his movements and then “followed him to a bar and ran him over with her car, killing him at the scene.” It’s almost as if selling cheap trackers and turning every iPhone into a tracking device was a terrible idea. If only the creators had talked to any woman, ever.
That second agent proved quite capable, not only agreeing that the situation was strange, but also looking into issues on Apple’s side. Which led to the somewhat bizarre conclusion of this story: after perhaps 20 minutes on the phone, he seemed to hit on something. I heard him laugh and say something along the lines of “that explains it” and then, with my consent, put me on hold. When he came back, he said—and I’m not exactly quoting, but close enough: “I’m sorry, I can’t tell you any more than this, but all your services should be back up pretty much exactly 12 hours after they went down.” Cloud computing is bizarre. Cloud computing at Apple – doubly so.
One aspect of the jailbreak scene that always seemed like black magic to me, though, was the process of jailbreaking itself. The prospect is pretty remarkable: take any off-the-shelf iPhone, then enact obscene rituals and recite eldritch incantations until the shackles drop away. The OS will now allow you to run any code you point at it, irrespective of whether that code has gone through Apple’s blessed signing process, paving the way for industrious tweak developers like myself. A few weeks ago, I got a hankering to remove this shroud of mystery from jailbreaks by writing my own. One caveat: the really juicy work here has been done by my forebears. I’m particularly indebted to p0sixninja and axi0mx, who have graciously shared their knowledge via open source. The fact that this isn’t a switch to flip somewhere in iOS is idiotic and will soon come to an end thanks to the EU, but at least it enticed some very creative and gifted souls to learn and experiment.
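To make concrete what a jailbreak is actually defeating: at its core, the OS refuses to execute any binary whose signature doesn’t check out against Apple’s trust chain. The sketch below is a deliberately crude, purely illustrative model of that gate – real iOS code signing involves CMS signatures, entitlements, and the AMFI kernel extension, not an HMAC over a file hash, and every name here is made up:

```python
# Grossly simplified model of a code-signing gate: the OS hashes the
# binary and verifies a signature over that hash against a trusted key.
# (Illustrative only; real iOS signing is far more involved.)
import hashlib
import hmac

TRUSTED_KEY = b"apple-signing-key"  # stand-in for Apple's signing identity

def sign(binary: bytes, key: bytes = TRUSTED_KEY) -> bytes:
    """Produce a signature over the binary's hash (the 'blessed' path)."""
    digest = hashlib.sha256(binary).digest()
    return hmac.new(key, digest, hashlib.sha256).digest()

def kernel_will_execute(binary: bytes, signature: bytes) -> bool:
    """The check a jailbreak patches out: refuse anything unsigned."""
    digest = hashlib.sha256(binary).digest()
    expected = hmac.new(TRUSTED_KEY, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

app = b"\x90\x90\xc3"  # some machine code
print(kernel_will_execute(app, sign(app)))     # signed: allowed to run
print(kernel_will_execute(app, b"\x00" * 32))  # unsigned: refused
```

A jailbreak, in this cartoon version, is whatever sequence of exploits lets you flip `kernel_will_execute` into always returning true.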
It seems the new iPhone 15 Pro is having overheating issues, and while I normally don’t really care about and don’t mention this sort of nonsense, I found Apple’s response to the issue… peculiar. Furthermore, Apple tells 9to5Mac that recent updates to certain third-party apps are causing them to overload the system. The company says it’s working directly with those developers to fix the issues. According to Apple, some of the apps overloading the iPhone CPU and causing devices to overheat are Asphalt 9, Instagram, and Uber. Instagram issued a fix for the problem on September 27, Apple says. Apple designs and builds the SoC, the thermal system, the outer casing, the operating system, and the APIs, and is the gatekeeper for every application that runs on an iPhone – and yet the company still blames third-party developers? How is it even possible that any of these applications can cause unexpected overheating in the first place, and how, if the App Store review process is in place to protect users, did nobody at Apple catch this during review? If they can’t even detect and stop applications that can physically damage your iPhone, how on earth can anyone trust them to stop malware, spyware, and other crapware? I can’t believe people still fall for this.
Cory Doctorow: Right to repair has no cannier, more dedicated adversary than Apple, a company whose most innovative work is dreaming up new ways to sneakily sabotage electronics repair while claiming to be a caring environmental steward, a lie that covers up the mountains of e-waste that Apple dooms our descendants to wade through. Why does Apple hate repair so much? It’s not that they want to poison our water and bodies with microplastics; it’s not that they want to hasten the day our coastal cities drown; it’s not that they relish the human misery that accompanies every gram of conflict mineral. They aren’t sadists. They’re merely sociopathically greedy. Tim Cook laid it out for his investors: when people can repair their devices, they don’t buy new ones. When people don’t buy new devices, Apple doesn’t sell them new devices. A few weeks ago, when news broke that Apple had switched from opposing California’s right to repair bill to supporting it, and the entire tech media was falling over itself to uncritically report on it, I instinctively knew something was up. Supporting right to repair was so uncharacteristic of Apple and Tim Cook, I just knew something was off. It turns out I was right. Instead of relying on the absence of right to repair laws, Apple is simply making it so that using any parts not approved by Apple in a repair makes your Apple device stop functioning properly. They do so by VIN-locking parts – or parts-pairing, as it’s called in the tech industry – and if the device’s SoC detects that an unapproved repair is taking place, the device simply won’t accept it, even if genuine Apple parts are being used. Trying to circumvent this parts-pairing violates the DMCA – and the DMCA is federal law, while California’s right to repair bill is state law, meaning the DMCA overrules it.
Doctorow lists various other things Apple does to limit your ability to repair devices, such as claiming to “recycle” devices when you return them to Apple, only for the company to shred them instead to prevent their parts from making it into the repair circuit. Apple also puts tiny serial numbers on every single part, so that even when devices are scrapped for parts, usually in Asia, Apple can work together with US Customs to intercept and destroy these fully working parts when they enter the US. So, Apple supporting California’s right to repair bill is entirely and utterly meaningless and hollow. It’s all for show, for the optics, to mislead the gullible 20-somethings in the tech media. I knew something was up, and I was right.
iOS 17 and iPadOS 17 offer several welcome improvements, tweaks, and new features. They also continue two trends that have dominated recent updates for both platforms: the expansion of widgets giving modular access to functions from a variety of apps, and on-device intelligence that improves search, recommendations, and more. This year’s update pushes both platforms forward just a bit—but not enough that too many people will notice. A more complete feature set will roll out over time, though, so by the end of the cycle, we’ll have seen a nice range of additions. Honestly, with how mature iOS (and Android, for that matter) have become, I don’t think it’s a bad thing that we’re seeing more iterative releases bringing polish and nips and tucks instead of massive feature overhauls and additions nobody is asking for.
iOS 17 expands on last year’s Lock Screen updates with the addition of interactive widgets and StandBy, a new feature that turns the iPhone into a mini home hub while it is charging. You can now see voicemail transcriptions in real time, and leave video messages in FaceTime. FaceTime also now works on the Apple TV with tvOS 17. Apple also released watchOS 10, tvOS 17, and HomePod Software 17. Take a guess which one is the unwanted child.
Before the ubiquity of the Internet, before Wi-Fi, even before Ethernet was affordable, there was the LocalTalk physical layer and cabling system and its companion suite of protocols, AppleTalk. It was a network ahead of its time in terms of plug-and-play, though at 230.4 kbit/s it was nowhere near as fast as 10 Mbit/s Ethernet. This article goes into great detail about setting up an AppleTalk network today.
Macs have brought a great deal to us over the years: desktop publishing, design, image editing and processing, multimedia, and more. One of the few fields where they have failed is programming, despite many attempts. Here I look back at some of those opportunities we missed. It’s only a mildly related aside, but even though I personally would love to get into programming in some form, it’s actually a lot harder to get into than many programmers tend to think. Learning how to program has big “the rest of the fucking owl” energy: the most basic concepts are relatively easy to grasp, but the leap from those basics to actually using them for something useful is absolutely massive and fraught with endless pitfalls. Many, many have tried to bridge this canyon, and Apple sure has tried numerous times, as this article illustrates, but other than starting at a young age, never losing interest, and never standing still for too long, it seems nobody has found a genuinely good, reliable way of teaching latecomers how to program.
At WWDC earlier this year, Apple announced that upcoming versions of iOS and macOS would ship with a new feature powered by “a Transformer language model” that will give users “predictive text recommendations inline as they type.” Upon hearing this announcement, I was pretty curious about how this feature works. Apple hasn’t deployed many language models of their own, despite most of their competitors going all-in on large language models over the last couple of years. I see this as a result of Apple generally priding themselves on polish and perfection, while language models are fairly unpolished and imperfect. As a result, this may be one of the first Transformer-based models Apple ships in one of its operating systems, or at least one of the first it has acknowledged publicly. This left me with some questions about the feature. Jack Cook did some digging into this new feature and the language model it uses, and came up with some very interesting findings. He also details his process, and of course, the code he wrote to do all of this is available on GitHub.
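To get a feel for what “predictive text recommendations inline as they type” means mechanically: a language model assigns probabilities to candidate next tokens given the text so far, and the keyboard surfaces a suggestion only when the top candidate clears some confidence bar. The toy below substitutes a hand-built bigram table for the actual Transformer; every word, probability, and threshold in it is invented purely for illustration:

```python
# Toy inline prediction: a bigram table standing in for a Transformer.
# All probabilities are made up for illustration.
BIGRAMS = {
    "good": {"morning": 0.55, "night": 0.30, "grief": 0.15},
    "thank": {"you": 0.95, "goodness": 0.05},
    "see": {"you": 0.40, "it": 0.35, "more": 0.25},
}

CONFIDENCE_THRESHOLD = 0.5  # only suggest when the model is fairly sure

def suggest(prev_word: str):
    """Return the most likely next word, or None if confidence is too low."""
    candidates = BIGRAMS.get(prev_word.lower())
    if not candidates:
        return None
    word, prob = max(candidates.items(), key=lambda kv: kv[1])
    return word if prob >= CONFIDENCE_THRESHOLD else None

print(suggest("thank"))  # "you" -- well above the threshold
print(suggest("see"))    # None -- top candidate is only 0.40
```

The real feature differs in every detail, but this thresholding idea – only show a completion when the model is confident – is presumably the part users actually experience as the inline grey-text suggestion.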
A major change introduced by iPadOS 17 that is going to make video creators and gamers happy is support for UVC (USB Video Class) devices, which means an iPad can now recognize external webcams, cameras, video acquisition cards, and other devices connected over USB-C. I started testing iPadOS 17 thinking this would be a boring addition I’d never use; as it turns out, it’s where I had the most fun tinkering with different pieces of hardware this summer. Most of all, however, I did not anticipate I’d end up doing FaceTime calls with a Game Boy Camera as my iPad Pro’s webcam. This is amazing.
Over the past year or so, I’ve been working with other BlueSCSI developers to add Wi-Fi functionality to their open-hardware SCSI device, enabling Wi-Fi support for old Macs and other vintage computers going back some 36 years. This is my Macintosh Portable M5126. It’s very Macintosh and hardly portable. For some reason I’m using it on my lawn, reading the Wi-Fi Wikipedia article over Wi-Fi through my Wikipedia application for System 6, with my Wi-Fi Desk Accessory showing it connected to my “!” network with meager signal strength. With PCB production having become relatively commoditised, we’re seeing so many pieces of hardware designed specifically for retro computing, and it’s great. A small audience is no longer a limiting factor in making things like this available, and I’m here for it.
The original iMac entered a computing world that was in desperate need of a shake-up. After the wild early days of the personal computer revolution, things had become stagnant by the mid-1990s. Apple had spent a decade frittering away the Mac’s advantages until most of them were gone, blown out of the water by the enormous splash of Windows 95. It was the era of beige desktop computers chained to big CRT displays and other peripherals. In 1997, Steve Jobs returned to an Apple that was at death’s door, and in true Princess Bride style, he rapidly ran down a list of the company’s assets and liabilities. Apple didn’t have a wheelbarrow or a holocaust cloak, but it did have a young industrial designer who had been experimenting with colors and translucent plastic in Apple’s otherwise boring hardware designs. The original iMac is simply a delightful machine. I vividly remember that the reception and administrative workers at the orthodontic department at the hospital in Alkmaar used them, and teenage me would peek past the reception desk to catch glimpses of the colourful machines. I still love the original iMac.
Apple’s M2 Ultra powered Mac Pro is the final step in their Apple Silicon transition. But without GPU support or meaningful expansion, is it worth nearly double the price of a comparable Mac Studio? It really seems like high-end computing is simply no longer possible on the Mac. The Mac Pro is a joke, the memory limits on the M2 chips make them useless for high-end uses, there aren’t enough PCIe lanes, the integrated GPUs are a joke compared to offerings from AMD and NVIDIA, and x86 processors at the higher end completely obliterate the M2 chips. At least ARM Macs use less power, so there’s that. But then, if you have to wait longer for tasks to finish – or can’t perform your tasks at all – does that really matter on your stationary, high-end workstation?
There’s been a lot of concern recently about the Web Environment Integrity proposal, developed by a selection of authors from Google, and apparently being prototyped in Chromium. There’s good reason for anger here (though I’m not sure yelling at people on GitHub is necessarily the best outlet). This proposal amounts to attestation on the web, limiting access to features or entire sites based on whether the client is approved by a trusted issuer. In practice, that will mean Apple, Microsoft & Google. Of course, Google isn’t the first to think of this; in fact, they’re not even the first to ship it. Apple already developed & deployed an extremely similar system last year, now integrated into macOS 13, iOS 16 & Safari, called “Private Access Tokens”. Ten bucks says this bad thing Apple is already shipping will get far less attention than a proposal by Google.
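The shape of such an attestation scheme is easy to sketch: a trusted issuer attests that a client is “approved” and hands it a token, and the website serves content only to token bearers – everyone else is locked out. The sketch below is a toy, not the real protocol: a shared-secret HMAC stands in for the public-key and blind-signature cryptography that Private Access Tokens (built on the Privacy Pass architecture) actually use, and all names are invented:

```python
# Toy attestation flow: issuer vouches for "approved" clients via a
# token; the origin serves only token bearers. Illustrative only --
# real Private Access Tokens use blind signatures, not a shared HMAC key.
import hashlib
import hmac
import os

ISSUER_KEY = b"issuer-secret"  # stand-in for the issuer's key material

def issue_token(client_passes_device_check: bool):
    """The attester: unapproved clients get nothing."""
    if not client_passes_device_check:
        return None
    nonce = os.urandom(16)
    tag = hmac.new(ISSUER_KEY, nonce, hashlib.sha256).digest()
    return nonce + tag  # 16-byte nonce + 32-byte tag

def origin_accepts(token) -> bool:
    """The website: serve content only to bearers of a valid token."""
    if token is None or len(token) != 48:
        return False
    nonce, tag = token[:16], token[16:]
    expected = hmac.new(ISSUER_KEY, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

print(origin_accepts(issue_token(True)))   # approved client: served
print(origin_accepts(issue_token(False)))  # everyone else: locked out
```

The controversy is entirely in that boolean: whoever controls `issue_token` decides which browsers, operating systems, and devices get to see the web.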
Apple says it will remove services such as FaceTime and iMessage from the UK rather than weaken security if new proposals are made law and acted upon. The government is seeking to update the Investigatory Powers Act (IPA) 2016. It wants messaging services to clear security features with the Home Office before releasing them to customers. The act lets the Home Office demand security features are disabled, without telling the public. Under the update, this would have to be immediate. I wonder if Apple would actually follow through with something like this, or if they’re only looking for a token concession so they can claim they’re still in the clear and do nothing. Interesting, though, that when the Chinese government comes calling, Tim Cook drops his “privacy is a fundamental human right” shtick real quick, but when the government of a western country comes calling, it’s a lot of rah-rah. A spine is clearly not very expensive.
Apple is officially releasing the first public betas of iOS 17, iPadOS 17, watchOS 10, and macOS 14 Sonoma today, a little over a month after releasing the first developer betas at its Worldwide Developers Conference. I have to say, Apple is doing a great job with their public beta access. It’s easy enough that it’s accessible, but not so easy you’ve got millions of people running unstable software. Considering the number of platforms they have to support – that’s no easy feat.
Apple today announced the availability of new software tools and technologies that enable developers to create groundbreaking app experiences for Apple Vision Pro — Apple’s first spatial computer. Featuring visionOS, the world’s first spatial operating system, Vision Pro lets users interact with digital content in their physical space using the most natural and intuitive inputs possible — their eyes, hands, and voice. Starting today, Apple’s global community of developers will be able to create an entirely new class of spatial computing apps that take full advantage of the infinite canvas in Vision Pro and seamlessly blend digital content with the physical world to enable extraordinary new experiences. With the visionOS SDK, developers can utilize the powerful and unique capabilities of Vision Pro and visionOS to design brand-new app experiences across a variety of categories including productivity, design, gaming, and more. I’m genuinely interested to see if third party developers can come up with better use cases for Apple’s VR headset than Apple itself did.
From CrossOver’s blog: Apple revealed their new Game Porting Toolkit today at WWDC. This Toolkit is designed to allow Windows game developers a way to easily and quickly determine how well their game could run on macOS, with the ultimate goal of facilitating the creation of Mac game ports. We are ecstatic that Apple chose to use CrossOver’s source code as their emulation solution for the Game Porting Toolkit. We have decades of experience creating ports with Wine, and we are very pleased that Apple is recognizing that Wine is a fantastic solution for running Windows games on macOS. We did not work with Apple on this tool, but we would be delighted to work with any game developers who try out the Game Porting Toolkit and see the massive potential that Wine offers. So, Apple basically repackaged Wine. It’s interesting that they’re going the same route as Valve, just less openly, and since it’s not core to the company’s business, it probably won’t be nearly as good and aggressive at getting new games to work as Valve’s Proton is, both through Valve itself and the countless modified versions of Proton from third parties.
After years of speculation, leaks, rumors, setbacks, and rumblings of amazing behind-the-scenes demos, Apple has made its plans for a mixed reality platform and headset public. Vision Pro is “the first Apple Product you look through, not at,” Apple’s Tim Cook said, a “new AR platform with a new product” that augments reality by seamlessly blending the real world with the digital world. The headset will start at $3,499 and be available early next year. That puts the device in an entirely different class than most existing VR headsets, including the $550 PSVR2 (which requires a tethered PS5 to use) and the $500 Quest 3 that was just announced for a fall release. The technology on display here is amazing, but the presentation itself, including Apple’s proposed use cases, was thoroughly dystopian. When you’re wearing it, a video feed of your eyes can be shown on the outside display when talking to someone next to you, which looks like pure nightmare fuel to me. Apple also showed a birthday party where the dad was wearing this thing while his daughter and her friends were blowing out the candles – which, as a dad… just no. Don’t wear the creepy glowing robot face during your daughter’s birthday party. Other than that, since it has no controllers, the gaming proposition consisted of regular “2D” games projected on a screen, so you can’t play popular VR games like Beat Saber or Gorilla Tag. Since the device tries very hard to mimic a traditional user interface in VR, many of the renders shown off during the presentation consisted of floating windows. Videoconferencing consisted of floating windows with camera feeds from the participants, for instance, while the VR user’s face is rendered onto an avatar. Showing multiple application windows floating around you definitely looks very cool, but whether that’s actually a pleasant user experience? I don’t know.
But the biggest problem with the whole presentation is that Apple did not actually show off anything tangible. Everything shown during the keynote was fake – prerendered special effects layered onto video – and since nobody has had any hands-on time with the actual hardware, nobody outside of Apple has seen the real user interface in action, and we have no idea how it will actually look, feel, and perform. This is the AR/VR equivalent of using prerendered cinematics to create hype for a video game, and we should know better by now. If there’s one company that can convince people to spend $3500 to strap an isolating, dystopian glowing robot mask onto their faces, it’s Apple, but I still have a hard time believing this is what people want.