Geek stuff Archive

OpenAI announces GPT-4

We’ve created GPT-4, the latest milestone in OpenAI’s effort in scaling up deep learning. GPT-4 is a large multimodal model (accepting image and text inputs, emitting text outputs) that, while less capable than humans in many real-world scenarios, exhibits human-level performance on various professional and academic benchmarks. For example, it passes a simulated bar exam with a score around the top 10% of test takers; in contrast, GPT-3.5’s score was around the bottom 10%. We’ve spent 6 months iteratively aligning GPT-4 using lessons from our adversarial testing program as well as ChatGPT, resulting in our best-ever results (though far from perfect) on factuality, steerability, and refusing to go outside of guardrails. “Artificial intelligence” companies are iterating quickly now. I’m definitely looking forward to the new memes based on GPT-4.

Unicode 15.0.0 adds more eyes to ꙮ

The character “ꙮ” (U+A66E) is being updated in version 15.0.0. Because it doesn’t have enough eyes. It needs to have three more eyes. This character is rare. Very, very rare. Rare enough to occur in a single phrase, in a single text written in an extinct language, Old Church Slavonic. The text is a copy of the Book of Psalms, written around 1429 and kept in Russia. Basically, in some old Slavic languages, authors would stylise the “O” in their word for eye (“ꙩко”) by adding a dot in the middle to make it look like an eye. If there were two eyes, two of these characters would be joined together (“ꙭчи”). The final evolution of this character was “ꙮ”, used only once in human history, in the phrase “серафими многоꙮчити”, which translates to “many-eyed seraphim”. Here’s how this relates to Unicode: the person who originally added this character to Unicode made a mistake, and didn’t count the number of eyes correctly. There should be ten eyes, not seven. This error was discovered in 2020, and now it has been corrected. Awesome.
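
If you want to poke at the character yourself, Python's unicodedata module will confirm the code point and its name (a quick illustrative snippet of mine, not something from the article):

    import unicodedata

    ch = "\uA66E"  # the multiocular O discussed above

    print(ch)                    # ꙮ (if your font has a glyph for it)
    print(f"U+{ord(ch):04X}")    # U+A66E
    print(unicodedata.name(ch))  # CYRILLIC SMALL LETTER MULTIOCULAR O

Note that the Unicode 15.0.0 change only updates the representative glyph in the code charts; the code point and the character name stay exactly the same, so no existing text needs to be re-encoded.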

Electronic Catan LCD tiles

A collection of hexagon tiles that magnetically snap together to create a Settlers of Catan board of any shape or size. Each tile features a large round LCD and a custom magnetic pogo connector on each edge. Linking up a bunch of tiles creates a position-aware partial mesh network. This is just excellent. I want this.

Nreal Light review: hardware is only half the battle

Nreal’s Light sunglasses, which Verizon will start selling later this month, are one of only a few consumer-focused augmented reality headsets. They’re an impressive technical feat: small for an AR or VR product, comparatively affordable at $599, and capable of full-fledged mixed reality that projects images into real space, not just a flat heads-up overlay like the North Focals. Unfortunately, Nreal’s software doesn’t fulfill its hardware’s promise. The Light is hampered by a bare-bones control scheme, a patchy app ecosystem, and a general user experience that ranges from undercooked to barely functional. Nreal may well have shown us the future of AR, but it seems uninterested in making the experience very pleasant. Everybody is talking about AR glasses being the next big thing after smartphones, but to me they feel deeply dystopian and creepy – for very, very little benefit over using a smartphone. I’m sure AR glasses will be very welcome in countless professional settings, but I’m not so sure they will be embraced by general consumers in everyday life.

Students don’t know what files and folders are, professors say

Strange as it may seem to older generations of computer users who grew up maintaining an elaborate collection of nested subfolders, thanks to powerful search functions now being the default in operating systems, as well as the way phones and tablets obfuscate their file structure, and cloud storage, high school graduates don’t see their hard drives the same way. As anyone who has had to sift through a relative’s landfill organization technique can attest, most people shouldn’t be in charge of organizing their files. The machine should sort files based on metadata about the file, and people can select options and provide search criteria to filter the data. We’re power users here, but even I rely on fd, locate, and ripgrep quite often. I guess the most surprising part is that this is surprising. Computing is application focused. People open MS Office Word, Apple Pages, or LibreOffice Writer; they don’t open a file. Operating systems don’t have pluggable extensions which let people manipulate various file types; they have applications which run on them. On top of that, files and folders are a meta-construct so humans can grok filesystem semantics and, ultimately, blocks on a storage device.
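
To make the "let the machine filter on metadata" point concrete, here is a toy sketch of the idea in Python; it assumes nothing about any particular OS search index, and the criteria (extension, name fragment, modification time) are just illustrative:

    from pathlib import Path
    from datetime import datetime, timedelta

    def find(root, suffix=None, name_contains=None, newer_than_days=None):
        """Yield files under root that match simple metadata criteria."""
        cutoff = None
        if newer_than_days is not None:
            cutoff = datetime.now() - timedelta(days=newer_than_days)
        for path in Path(root).rglob("*"):
            if not path.is_file():
                continue
            if suffix and path.suffix.lower() != suffix.lower():
                continue
            if name_contains and name_contains.lower() not in path.name.lower():
                continue
            if cutoff and datetime.fromtimestamp(path.stat().st_mtime) < cutoff:
                continue
            yield path

    # Every PDF touched in the last 30 days, wherever it happens to live:
    for hit in find(Path.home(), suffix=".pdf", newer_than_days=30):
        print(hit)

Tools like fd and locate do this kind of filtering far more efficiently (locate by maintaining an index), but the principle is the same: describe what you want, not where you put it.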

The insane innovation of TI calculator hobbyists

In the mid-to-late 2000s, you either knew, or were, that kid in grade school. You know. The one who could put games on your graphing calculator. You may be surprised to learn that some of these people didn’t exist totally in a vacuum. There was in fact a thriving scene of hackers who had bent these calculators to their will, writing games, math software, and more generally hacking on the platform just for the sake of it. True to my interests, it’s all deeply embedded, pushing the limits of platforms that were obsolete when they were released. I’ll take you through some of the highlights of Texas Instruments calculator hacking done over the past two and a half decades, along with an explanation of why these projects are so technically impressive. In high school, a friend of mine and I bought the data transfer cable for our graphing calculators so we could play multiplayer Bomberman on them in class. Good times.

The software behind Eurovision

Whether you like the Eurovision Song Contest or not, no one can claim they don’t put on an extravagant show. After watching the performance last night, the thing that stayed with me most wasn’t the music, but the stunning lighting effects, visual effects and camera work. The 1831 lights, 24 cameras, 380 speakers and hundreds of mics all need orchestration in a way that’s hard to comprehend. Software makes it all possible. This year CuePilot was used to manage the entire production. CuePilot allows them to pre-programme all movement and to create a script for programming the lights, to pass to the camera operators and so on. It even allows them to create an entire pre-visualisation of the show, a 3D rendered simulation, before any footage has been shot.

It’s so nice that we work now in, actually it feels like a videogame. I cut my shots in CuePilot, I send it to , they put it in the venue, and the venue is complete 3D of course now, with the light, with the movements, with the LED content and actually I see the song or the performance in actually real time and more-or-less real life.

The objective is not just to create an elaborate show, but also to manage the emotions of the audience watching it. Gil Laufer, an MSc student at KTH Royal Institute of Technology in Stockholm and a Eurovision fanatic, has researched the effect it has.

Our hypothesis was that pre-programmed camera work will result in a more unified experience among the viewers. A unified experience means that in terms of emotions and their intensities, each individual among a group of viewers would feel the same as the other group members. This can be measured and later analyzed using statistical methods. The conclusion drawn from the research is that pre-programmed camera work can result in a more unified experience compared to manual camera work. The ability to do that depends on the overall creativity value of the production, which in turn depends on various aspects such as the number of cameras and the available shooting angles, the production team’s proficiency in using tools such as CuePilot, and the time that the team got to spend on the production.

Musical productions may not be the usual fare for OSAlert, but the fact is that the sophistication of the orchestration, the simulation of the final show, and the bridging between the software and the hardware it’s controlling just wouldn’t be possible without the developments made in operating system and software integration over the last two decades.

Booting from a vinyl record

Most PCs tend to boot from a primary media storage, be it a hard disk drive, or a solid-state drive, perhaps from a network, or – if all else fails – the USB stick or the boot DVD comes to the rescue… Fun, eh? Boring! Why don’t we try to boot from a record player for a change? I hope he’s using gold-plated triple-insulated Monster cables with diamond tips and uranium signal repeaters, because otherwise the superior quality of the vinyl will get lost. Would be a shame.
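
For the curious: as far as I can tell, the trick leans on the original IBM PC's cassette interface, which expects the boot program as two-tone (frequency-shift-keyed) audio, so the record is really just a very fancy cassette. I haven't reproduced his exact parameters, but turning bytes into that kind of audio looks roughly like the sketch below; the frequencies, bit timing, and file names here are illustrative, not the real cassette format.

    import math
    import struct
    import wave

    RATE = 44100          # samples per second
    F0, F1 = 1000, 2000   # illustrative tones for 0 and 1 bits
    BIT_SECONDS = 0.002   # illustrative bit duration

    def tone(freq, seconds):
        """Generate one 16-bit sine tone of the given frequency and length."""
        n = int(RATE * seconds)
        return [int(32767 * math.sin(2 * math.pi * freq * i / RATE)) for i in range(n)]

    def encode(data):
        """FSK-encode a byte string: one tone per bit, least significant bit first."""
        samples = []
        for byte in data:
            for bit in range(8):
                samples += tone(F1 if (byte >> bit) & 1 else F0, BIT_SECONDS)
        return samples

    with open("bootcode.bin", "rb") as f:       # hypothetical boot image
        samples = encode(f.read())

    with wave.open("bootcode.wav", "wb") as w:  # audio you could, in principle, cut to vinyl
        w.setnchannels(1)
        w.setsampwidth(2)                       # 16-bit samples
        w.setframerate(RATE)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))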

The golden age of computer user groups

The Homebrew Computer Club where the Apple I got its start is deservedly famous—but it’s far from tech history’s only community gathering centered on CPUs. Throughout the 70s and into the 90s, groups around the world helped hapless users figure out their computer systems, learn about technology trends, and discover the latest whiz-bang applications. And these groups didn’t stick to Slacks, email threads, or forums; the meetings often happened IRL. But to my dismay, many young technically-inclined whippersnappers are completely unaware of computer user groups’ existence and their importance in the personal computer’s development. That’s a damned shame. Our current reality may largely be isolated to screens, but these organizations helped countless enthusiasts find community because of them. Computer groups celebrated the industry’s fundamental values: a delight in technology’s capabilities, a willingness to share knowledge, and a tacit understanding that we’re all here to help one another. And gosh, they were fun. I wonder if we’ll ever see a rebound, the pendulum swinging back, where people who grew up in the screen age long for more personal contact and reignite the interest in these old-fashioned user groups. After the current crisis is over, of course.

Recommended YouTube channel: PBS Space Time

I'm going to do something different today - I'm going to highlight a YouTube channel that I personally really enjoy, and that I think might be a good fit for OSAlert readers as well. I plan on doing this more often, since I feel a text/article-only focus leads to OSAlert missing out on a bunch of really great and informative content. The channels I'll be recommending will all be focused on technology and science, and since I have a deep disdain for the stereotypical spammy, clickbaity YouTube channels, you can be assured I'll only be recommending truly informative and quality channels.

I'm going to start off with a channel called PBS Space Time.

Space Time explores the outer reaches of space, the craziness of astrophysics, the possibilities of sci-fi, and anything else you can think of beyond Planet Earth with our astrophysicist host: Matthew O’Dowd.

Matt O'Dowd spends his time studying the universe, especially really far-away things like quasars, super-massive black holes, and evolving galaxies. He uses telescopes in space to do it. Matt completed his Ph.D. at NASA's Space Telescope Science Institute, followed by work at the University of Melbourne and Columbia University. He's now a professor at the City University of New York's Lehman College and an Associate at the American Museum of Natural History's Hayden Planetarium.

Space Time does not shy away from hardcore astrophysics, covering subjects like quantum field theory, quantum physics, the theory and mathematics of black holes and other astrophysical phenomena, and much more. Videos run from 10 to 15 minutes, and many subjects are spread out over multiple videos - the content is in-depth, purely scientific, and definitely not always easy to grasp. I'm currently watching Planck's Constant and the origin of quantum mechanics, just to give you an idea of what to expect.
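
To give a flavour of the level the channel operates at: the Planck's constant story is built around the relation E = hν, energy coming in discrete lumps proportional to frequency. A worked one-liner (my own numbers, not taken from the video), for green light at roughly 5.6 × 10^14 Hz:

    E = h\nu \approx (6.626\times10^{-34}\,\mathrm{J\,s}) \times (5.6\times10^{14}\,\mathrm{Hz})
      \approx 3.7\times10^{-19}\,\mathrm{J} \approx 2.3\,\mathrm{eV}

That vanishingly small number per photon is a big part of why quantisation went unnoticed for so long.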

There's a decent back catalog of videos to enjoy, so if this is your cup of tea, you've just got a whole bunch of new tea to drink.

MIT has developed a ‘system for dream control’

There is a borderland between waking life and the uncharted wilderness of sleep that we all traverse each night, but we rarely stop to marvel at the strangeness of this liminal world. If we do, we find that it is full of hallucinations both wonderful and terrifying, a mental goulash of reality and fantasy.

Usually we pass through this state of half-wakefulness on our way to deep sleep within minutes. We may experience microdreams during the transition, but the content of these microdreams appears to be random and we usually don't have any memory of them when we wake. A team of researchers led by MIT doctoral candidate Adam Horowitz wants to change that.

Horowitz and his colleagues at the MIT Media Lab have developed a relatively simple device called Dormio to interface with this unique stage of sleep. Their hypothesis is that this liminal period between wakefulness and sleep is a fount of creativity that is usually lost in the ocean of sleep. The thinking is that if you’re able to descend into that stage of sleep and return to consciousness without descending deeper into sleep, you will benefit from the intensely associative thinking that characterizes the strange microdreams experienced during the transition to sleep.

There's so much we don't know about sleeping, dreaming, and the brain as a whole, that I'd be quite nervous about using devices like these before we have a better understanding of our brain. Still, if it works, this is quite cool.

What Deep Space Nine does that no other Star Trek series can

Fifty years ago today, on 8 September 1966, Gene Roddenberry's Star Trek appeared on television for the first time - and forever changed the world. There's an endless string of articles all over the web, but as one of those Deep Space Nine fanatics the rest of the Trek world would rather not talk about, this great article by Annalee Newitz really struck a chord with me.

Without this stubborn nugget of hope at its core, DS9 would be more like the 2000s version of Battlestar Galactica - a story about space mysticism and war that's laced with a fatalism about humanity. Ron Moore was an executive producer on DS9 and the creator of BSG, so the overlap makes sense. But on DS9, we are immersed in a world where our faith in the basic decency of intelligent beings can remain unshaken. Whether solid or liquid, most of the creatures who live on the space station always do the right thing. And most importantly, the good guys prevail not just because they are good, but because they are able to put their ideals to practical use. More than TNG and Voyager, DS9 helps us understand how humans got from the Bell Riots to social democracy in space. Our heroes do it by resisting imperialism and inequality and by allying themselves with other people who do. That's why the Federation has struck a deal with the Bajorans rather than the Cardassians.

"Do you think they'll be able to save us?" The best scene in all of Star Trek (this one's a close second).

Thank you, Mr. Roddenberry.

A new theory explains how consciousness evolved

Ever since Charles Darwin published On the Origin of Species in 1859, evolution has been the grand unifying theory of biology. Yet one of our most important biological traits, consciousness, is rarely studied in the context of evolution. Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that's why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?

The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions. The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence. If the theory is right - and that has yet to be determined - then consciousness evolved gradually over the past half billion years and is present in a range of vertebrate species.

I know this really isn't what you'd generally expect to be posted here, but the concept of consciousness - one of a small set of words in the English language I cannot spell off the top of my head without making errors - is one of those things where, if you think too deeply about it, you enter a realm of thinking that can get deeply uncomfortable and distressing, like thinking about what's outside the universe or what "existed" "before" (quotes intentional) the big bang.

Personally, I'm one of those insufferable people who ascribes the entire concept of consciousness to the specific arrangement of neurons and related tissue in our brain and wider nervous system - I don't accept religion or some other specific magical thing that makes us humans (and dolphins? And chimpanzees? And whatever else has some level of consciousness?) more special than any other animal in terms of consciousness.

I also don't like the controversial concept of splitting consciousness up into an easy and a hard problem, because to me, that just opens the door to maintaining the religious idea that humans are somehow more special than other animals - sure, science has made it clear some other animals have easy consciousness, but humans are still special because we are the only ones with hard consciousness. It reeks of an artificial cutoff point created to maintain some semblance of uniqueness for Homo sapiens sapiens so we can feel good about ourselves.

You can take the whole concept of consciousness in every which way, and one of my recent favourites is CGP Grey's video The Trouble With Transporters, which, among other things, poses the question - if you interrupt your consciousness by being teleported or going to sleep, are you really the same person when you rematerialise or wake up?

Have fun!

Kepler mission discovers bigger, older cousin to Earth

NASA's Kepler mission has confirmed the first near-Earth-size planet in the "habitable zone" around a sun-like star. This discovery and the introduction of 11 other new small habitable zone candidate planets mark another milestone in the journey to finding another "Earth."
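
For a rough sense of what "habitable zone" means quantitatively: it is the band of orbital distances where a planet receives about the same stellar flux as Earth, and since flux falls off with the square of distance, that distance scales with the square root of the star's luminosity. A back-of-the-envelope version (my own illustration, not NASA's full definition, which also accounts for atmospheric and stellar effects):

    d_{\mathrm{HZ}} \approx \sqrt{L_\star / L_\odot}\ \mathrm{AU},
    \qquad L_\star = 1.2\,L_\odot \;\Rightarrow\; d_{\mathrm{HZ}} \approx 1.1\ \mathrm{AU}

So a star 20% brighter than the Sun pushes the Earth-equivalent orbit out by roughly 10%.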

All the recent successes in space - Philae/Rosetta, New Horizons, the never-ending stream of discoveries from Kepler, like this one - actually make me sad, because they make me wonder how much more we could've achieved and discovered had we not developed this anti-science and pro-war climate we've been living in for a while now.

Maybe these new achievements will reignite the hunger for space. We can hope.

Leonard Nimoy, Spock of ‘Star Trek’, dies at 83

Leonard Nimoy, the sonorous, gaunt-faced actor who won a worshipful global following as Mr. Spock, the resolutely logical human-alien first officer of the Starship Enterprise in the television and movie juggernaut "Star Trek," died on Friday morning at his home in the Bel Air section of Los Angeles. He was 83.

"Of my friend, I can only say this: of all the souls I have encountered in my travels, his was the most... Human."

Elon Musk Is Warning us of Rogue Artificial Intelligence

We've highlighted the dire warnings of Tesla and SpaceX founder Elon Musk in recent months regarding the perils of artificial intelligence, but this week he actually managed to raise the bar in terms of making A.I. seem scary. More at Mashable. My take: I worked on AI 20 years ago (wow, time flies). I don't believe that we will ever create anything truly sentient. Intelligent and useful for our needs, yes. But truly sentient, no. For something to become evil, it must be sentient. Anything else that becomes problematic would just be a software bug, not evil.

7 Awesome Bits of Tech That Just Freakin' Disappeared

Carol Pinchefsky contemplates commercial-skipping DVRs, and other tales of really good technology that vanished, in 7 Awesome Bits of Tech That Just Freakin' Disappeared. As Pinchefsky writes: "...It got me thinking about awesome technology that we somehow ditched. The airship? Awesome. Slide rules? Awesome awesome. Mir Space Station? Boss-level awesome. And now just thinking about wristwatches with calculators makes me suffer a sense of short-term nostalgia (as in Douglas Coupland's Generation X). Here are some of the coolest features and products that we've lost along the way to 2012."

Wall Street’s Cult Calculator Turns 30

Matthew Rothman bought an HP 12c financial calculator for his first job out of college in 1989. Years later, he still has the same calculator. And he still uses it constantly, just like thousands of other 12c enthusiasts. "Whenever I switch jobs, I just peel the old business card that is on the back and tape my newest one on," says Mr. Rothman, head of quantitative equity strategies at Barclays Capital in New York. Sales of the device, which debuted in 1981, haven't slipped even after its manufacturer, Hewlett-Packard Co., introduced more-advanced devices or even, two years ago, a 12c iPhone application, which replicates all the calculator's functions, the company says.
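
For anyone who never touched a 12c: its bread and butter is time-value-of-money arithmetic, compound interest run forwards and backwards. A minimal sketch of that kind of calculation (my own illustration, nothing to do with HP's firmware or the iPhone app's internals):

    def future_value(present, annual_rate, years):
        """Grow a present amount forward with annual compounding."""
        return present * (1 + annual_rate) ** years

    def present_value(future, annual_rate, years):
        """Discount a future cash amount back to today at a fixed annual rate."""
        return future / (1 + annual_rate) ** years

    # $10,000 due in 5 years, discounted at 6% per year:
    print(round(present_value(10_000, 0.06, 5), 2))   # 7472.58

The real calculator wraps this (plus payments, amortisation, bond maths and so on) behind its famous RPN keypad, which is a big part of why its fans refuse to give it up.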