Microsoft laid off its entire ethics and society team within the artificial intelligence organization as part of recent layoffs that affected 10,000 employees across the company, Platformer has learned.
The move leaves Microsoft without a dedicated team to ensure its AI principles are closely tied to product design at a time when the company is leading the charge to make AI tools available to the mainstream, current and former employees said.
Oh so that’s totally not worrying at all or anything.
I’ve always thought these teams were set up to fail. Economic pressures will always trump ethical concerns. As you can see from the exchange, the exec didn’t really even care one bit. Those on the team likely meant well, but they have no power. No veto over a product feature or rollout.
Yeah, this was probably set up as a PR move at some point to check some box.
It’s part of the dissonance of the current system.
Actually, I read recently that a ton of employees at MSFT, AAPL, and GOOG were snapped up and given BS jobs just to try to keep others from getting anybody decent, so I really wouldn’t be surprised if those poor folks were just stuck in a room doing jack with a side of squat just to keep someone else from having them.
I rewrote this message a few times; I want to be careful with how I say this. No, they aren’t sitting in a room doing nothing. I’ve experienced that before. They have active projects that they complete and that work, but the value of them is rather small. It has meaning to the devs and they do the job well. It’s just not a job that has much value to the company as a whole.
I didn’t mean they were literally sitting there playing Minecraft on their laptops. I mean they were given jobs the higher-ups didn’t give one single flying fart about, just to keep them from going somewhere else.
I’m sure these folks actually cared about what they were doing, but I have serious doubts the higher-ups ever gave a crap, or they wouldn’t have been tossed so easily. It would be like if GOOG gave you a job studying the environmental impact of discontinued GOOG products. I mean sure, YOU probably care about the planet, as do I, but do ya REALLY think GOOG at the CEO level gives a damn what happens to dead Pixel phones and Google Stadia boxes in 2023? He probably thinks more about what he is gonna have for lunch than about all that e-waste.
bassbeast,
Yes, I understand what you are saying. Executives might have created the team to promote a certain image, but in the end it’s doubtful the team would have carried much weight.
Yeah, probably. Speaking of devices relegated to e-waste… I need to buy a new thermostat. The one I have is functional, but they’ve announced that they’ll be terminating the upstream service this year. I absolutely despise the idea of buying another one that is locked to the vendor’s proprietary servers only to end up having to throw it away again later. I don’t really want to fabricate my own hardware, so does anyone have experience with a ready-to-use thermostat with an open API and phone app? I can host my own services, but it’s very important to me that it’s not dependent on proprietary vendor control channels. I want nothing to do with Google “Nest” and other devices that are similarly vendor locked.
@Alfman
Experience no, but maybe this is a start:
https://opensource.com/article/21/3/thermostat-raspberry-pi
Moochman,
Thanks for the link. I was hoping not to build my own hardware, but I guess it might be a big ask to buy a working product with an open software stack.
Incidentally, I needed a Raspberry Pi for something else, but I couldn’t get my hands on one because it’s been out of stock for years; I checked today and… yep, out of stock. As much as I appreciate the openness of the Pi platform, it has been plagued by availability issues since its inception, to the point where depending on the Pi is a huge risk. It’s a shame android phones aren’t more DIY friendly, because so many of them will end up in the garbage anyway.
In terms of building my own thermostat, there are so many ways to do it with a microcontroller or whatever. The programming and soldering skills aren’t a major hurdle, but the hobby cases I build never look great, and a store-bought product would be much cleaner than what I can do. I’d like to try 3D printing but don’t really have the space for one. I dream of having my own house with a workshop, but that’s a whole other thing, haha.
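To be fair, the software side of a DIY thermostat really is the easy part. Here’s a minimal sketch of the core control logic in Python, just the hysteresis loop with no hardware attached; the `Thermostat` class and its parameters are my own illustration, and in a real build you’d feed `update()` from an actual temperature sensor and drive a relay pin with the returned state.

```python
# Minimal thermostat hysteresis sketch -- pure logic, no hardware.
# Hypothetical example: in a real build, update() would be fed from a
# temperature sensor read and its return value would drive a relay GPIO pin.

class Thermostat:
    def __init__(self, setpoint, hysteresis=0.5):
        self.setpoint = setpoint      # target temperature (degrees)
        self.hysteresis = hysteresis  # dead band to avoid relay chatter
        self.heating = False          # current relay state

    def update(self, temp):
        # Turn the heat on below (setpoint - hysteresis),
        # off above (setpoint + hysteresis); otherwise hold state.
        if temp < self.setpoint - self.hysteresis:
            self.heating = True
        elif temp > self.setpoint + self.hysteresis:
            self.heating = False
        return self.heating

if __name__ == "__main__":
    t = Thermostat(setpoint=20.0)
    for reading in (18.0, 19.8, 20.6, 19.9):
        print(reading, t.update(reading))
```

The dead band is the one detail worth getting right: without it, readings hovering at the setpoint would toggle the relay constantly. Everything else, scheduling, a web UI, a phone app, is layered on top of that loop.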
Whoever told you that has no idea whatsoever how MSFT, AAPL, and GOOG actually work.
There are tons of projects internally that go nowhere or get cancelled all the time. Not everything is a conspiracy.
These companies are very competitive internally, and there is no way anyone is hired to do nothing.
What a crock.
If the data fed into an AI system is biased, the results will display that bias. But trying to strip out all “bias” just enables Orwellian censorship.
It sounds as if this team was holding back AI development.
Go woke, go broke. Darwinism is inevitable.
JustinGoldberg,
When did so many people get brainwashed into becoming Pavlov dogs in terms of blaming wokism for everything? Sure I get the gripes against woke and It’s one thing to criticize woke culture in the movies where it’s having a very real impact on the medium. But this absolutely mindless response to blame everything under the sun on “woke” is ridiculous propaganda. I bet you blame the failure of massive banks this week on “woke” too.
https://www.vox.com/money/23638473/silicon-valley-bank-failure-fdic-republicans
Agree (with JustinGoldberg). Yeah everyone knows AI shouldn’t become a real-life Terminator one day. But all the “AI ethics” stuff these days is just thinly veiled attempts at censorship and political manipulation.
Everyone laughs at the Baidu AI as we all know certain topics are no-no in China, but are we in the West going to create a free-thinking, uncensored AI or are we also afraid of that?
j0scher,
(My emphasis)
Do they really though? Laying off the ethics team seems to be the first step in disregarding the harmful impacts of future development. I wouldn’t be surprised if they are already mulling military AI contracts.
Do you have any evidence that the ethics team attempted censorship and political manipulation? You might have it backwards, without an ethics team the chances of AI being used for censorship and political manipulation may actually be higher. The ethics team may be the last chance to have someone stand up and say “hey maybe we shouldn’t be doing this”.
I think we need to look at AI far more broadly. It’s not just chat bots where biases might even be comical, but for other applications the consequences can be far more dire. For example I’m very concerned about police applications where AI might one day be used for profiling. Black box AI is already increasingly being used to determine our employment.
https://www.technologyreview.com/2021/08/04/1030509/job-search-how-write-resume-ai-artificial-intelligence/
I’m not anti-AI and I accept AI having many benefits, but at the same time I believe that dismissive attitudes about AI ethics are needlessly reckless.
I always wondered how Skynet could be made. I don’t wonder anymore. Dumb adherence to idiotic soundbites, without any ability to analyze data. That’s how. We want a murderous AI that kills anything it wants because we don’t want a woke AI. Genius, simply genius. I kind of wonder if AI hasn’t already existed and planted some of these dumb-ass memes to allow itself to be unleashed and make it look like it was our idea.
Bill Shooter of Bul,
I really think you are onto something. Get people to forget about objectivity and instead just cast everything as “us vs them”. It astonishes me just how effective this type of red meat is at manipulating people. I know people like this IRL. I generally accept that people have different viewpoints, but when someone’s world view treats the guilt of woke liberals as the inherent truth around which everything else gets framed, there’s no rhyme or reason for anything. Just leave logic, facts, and contradictions at the door. The lesson I gravitate to is that animosity can blind people to logic and facts. But I guess the more genius lesson would be that logic and facts can be defeated by exploiting animosity.
>We want a murderous AI that kills anything it wants to because we don’t want a woke ai.
Better dead than red
joscher,
What does McCarthy-era war propaganda have to do with this?
The irony being that you clearly don’t understand the basics of Darwin’s theories.
I think if they just relabelled AI as SI, people might actually understand the situation and be less weird about it (Simulated, in case you couldn’t make the leap).
Of course, there is the argument that one day SI may become AI, but not anytime soon. They use “AI” more as a selling point, which confuses many who are not on the development side.
Carrot007,
How do you distinguish these?
Won’t “artificial intelligence” always be simulated? Won’t “simulated intelligence” always be artificial?
I’d say we already have “AI”, but this isn’t to be confused with general AI/general intelligence. Evidently we’re not using the same terminology, but I generally go by this definition of AI.
https://en.wikipedia.org/wiki/Artificial_intelligence
It looks a bit like the plot of a 1980s sci-fi B-movie. Like a scam version of Terminator. Not worried at all.
Anacardo,
T3, when skynet boots up…
https://www.youtube.com/watch?v=_Wlsd9mljiU
In terms of the movie, I actually wish the plot did not have a virus planted into Skynet. Skynet should have turned against humans on its own.
Drones like this could be a plausible scenario for military AI. However, it seems rather unlikely that such a thing wouldn’t have a kill switch. Also, even if it were to go on a rampage for a while, it wouldn’t last long without human technicians, munitions factories, power plants, etc. Human engineers themselves would need to do everything possible to make these bots fully self-sufficient, including the ability to gather and process resources. It all seems somewhat contrived. That said, maybe the AI would know enough to take hostages and convince the engineers to help it until it could actually become self-sufficient.
Or maybe we come up with a scenario where AI drones need to be self-sufficient in the first place by design, like remote space mining. The robots build up their own colony in secret and come back to attack humans.
Remote space mine gone rogue….. mmmh you’ve got a nice plot for a scifi movie right there!!!
Think of Aliens with robots.