Businesses

Amazon Pitches AI Tools as Co-Workers While Axing Jobs 32

Amazon used its annual re:Invent cloud conference in Las Vegas to pitch a vision of the workplace where AI agents serve not as tools but as "co-workers" and "teammates," even as the company proceeds with eliminating roughly 14,000 corporate jobs in its second major workforce reduction in recent years.

AWS CEO Matt Garman predicted on stage that autonomous "frontier agents" could represent 80 to 90% of enterprise AI value. Colleen Aubrey, senior vice president of applied AI solutions, described a future where companies manage "teams" of agents capable of working autonomously for hours or days while humans shift into supervisory roles. Amazon has already deployed agentic systems across tens of thousands of its own engineers to triage outages and propose fixes. The company calls these systems "teammates" rather than tools. CEO Andy Jassy has warned that AI would shrink Amazon's workforce, though a spokesperson attributed the current cuts to "reducing bureaucracy" and "removing layers" rather than AI deployment.
Open Source

How Home Assistant Leads a 'Local-First Rebellion' (github.blog) 100

Home Assistant is a free/open source home automation platform that runs locally and connects all your devices together, regardless of brand. A senior developer at GitHub calls it "one of the most active, culturally important, and technically demanding open source ecosystems on the planet," with tens of thousands of contributors and millions of installations.

That's confirmed by this year's "Octoverse" developer survey... Home Assistant was one of the fastest-growing open source projects by contributors, ranking alongside AI infrastructure giants like vLLM, Ollama, and Transformers. It also appeared in the top projects attracting first-time contributors, sitting beside massive developer platforms such as VS Code... Home Assistant is now running in more than 2 million households, orchestrating everything from thermostats and door locks to motion sensors and lighting. All on users' own hardware, not the cloud. The contributor base behind that growth is just as remarkable: 21,000 contributors in a single year...

At its core, Home Assistant's problem is combinatorial explosion. The platform supports "hundreds, thousands of devices... over 3,000 brands," as [maintainer Franck Nijhof] notes. Each one behaves differently, and the only way to normalize them is to build a general-purpose abstraction layer that can survive vendor churn, bad APIs, and inconsistent firmware. Instead of treating devices as isolated objects behind cloud accounts, everything is represented locally as entities with states and events. A garage door is not just a vendor-specific API; it's a structured device that exposes capabilities to the automation engine. A thermostat is not a cloud endpoint; it's a sensor/actuator pair with metadata that can be reasoned about.
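The entity model described above can be sketched in a few lines of Python. This is an illustrative toy, not Home Assistant's actual code: the class names, entity IDs, and attributes here are invented to show the idea of normalizing wildly different vendor devices into uniform entities with states, attributes, and change notifications.

```python
# Toy abstraction layer: every device, whatever its brand or API, becomes
# an Entity with a state and attributes that the automation engine can
# reason about uniformly. (Illustrative only; not Home Assistant's API.)
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class Entity:
    entity_id: str                      # e.g. "cover.garage_door"
    state: str                          # e.g. "open" / "closed"
    attributes: dict[str, Any] = field(default_factory=dict)

class StateMachine:
    """Holds all entity states and notifies listeners on every change."""
    def __init__(self) -> None:
        self._entities: dict[str, Entity] = {}
        self._listeners: list[Callable[[Entity, str], None]] = []

    def listen(self, callback: Callable[[Entity, str], None]) -> None:
        self._listeners.append(callback)

    def get(self, entity_id: str) -> Entity:
        return self._entities[entity_id]

    def set_state(self, entity_id: str, new_state: str, **attrs: Any) -> None:
        old = self._entities.get(entity_id)
        old_state = old.state if old else "unknown"
        merged = {**(old.attributes if old else {}), **attrs}
        entity = Entity(entity_id, new_state, merged)
        self._entities[entity_id] = entity
        for cb in self._listeners:      # fan out the state change
            cb(entity, old_state)

# Two very different vendor devices end up as the same kind of object:
sm = StateMachine()
sm.set_state("cover.garage_door", "closed", device_class="garage")
sm.set_state("climate.hallway", "heat", current_temperature=19.5, target=21.0)
```

Once everything is an entity, a garage door from one brand and a thermostat from another look structurally identical to the automation engine, which is what lets the platform survive vendor churn.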

That consistency is why people can build wildly advanced automations. Frenck describes one particularly inventive example: "Some people install weight sensors into their couches so they actually know if you're sitting down or standing up again. You're watching a movie, you stand up, and it will pause and then turn on the lights a bit brighter so you can actually see when you get your drink. You get back, sit down, the lights dim, and the movie continues." A system that can orchestrate these interactions is fundamentally a distributed event-driven runtime for physical spaces. Home Assistant may look like a dashboard, but under the hood it behaves more like a real-time OS for the home...
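The couch example above can be modeled as a rule subscribed to an event bus. Again a hedged sketch: the bus, the `binary_sensor.couch_occupied` entity, and the service-call strings are all invented for illustration, not Home Assistant's real automation syntax.

```python
# Minimal event-driven runtime: automations are just handlers subscribed
# to state-change events. (Illustrative; not Home Assistant's actual API.)
from collections import defaultdict
from typing import Callable

class EventBus:
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._subs[event_type].append(handler)

    def fire(self, event_type: str, data: dict) -> None:
        for handler in self._subs[event_type]:
            handler(data)

actions: list[str] = []          # stand-in for real service calls

def couch_automation(event: dict) -> None:
    # Trigger: the couch weight sensor's occupancy state changed.
    if event["entity_id"] != "binary_sensor.couch_occupied":
        return
    if event["new_state"] == "off":          # someone stood up
        actions.append("media_player.pause")
        actions.append("light.turn_on brightness=80%")
    else:                                    # sat back down
        actions.append("light.turn_on brightness=20%")
        actions.append("media_player.play")

bus = EventBus()
bus.subscribe("state_changed", couch_automation)
bus.fire("state_changed",
         {"entity_id": "binary_sensor.couch_occupied", "new_state": "off"})
```

The "real-time OS" framing follows from this shape: every sensor is an event source, every automation is a scheduled handler, and the runtime's job is dispatching between them with low latency.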

The local-first architecture means Home Assistant can run on hardware as small as a Raspberry Pi but must handle workloads that commercial systems offload to the cloud: device discovery, event dispatch, state persistence, automation scheduling, voice pipeline inference (if local), real-time sensor reading, integration updates, and security constraints. This architecture forces optimizations few consumer systems attempt.

"If any of this were offloaded to a vendor cloud, the system would be easier to build," the article points out. "But Home Assistant's philosophy reverses the paradigm: the home is the data center..."

As Nijhof says of other vendor solutions, "It's crazy that we need the internet nowadays to change your thermostat."
Unix

New FreeBSD 15 Retires 32-Bit Ports and Modernizes Builds (theregister.com) 32

FreeBSD 15.0-RELEASE arrived this week, notes this report from The Register, which calls it the latest release "of the Unix world's leading alternative to Linux." As well as numerous bug fixes and upgrades to many of its components, the major changes in this version are a reduction in the number of platforms the OS supports, along with changes to how it's built and how its component software is packaged.

FreeBSD 15 has significantly reduced support for 32-bit platforms. Compared to FreeBSD 14 in 2023, there are no longer builds for x86-32, POWER, or ARM-v6. As the release notes put it:

"The venerable 32-bit hardware platforms i386, armv6, and 32-bit powerpc have been retired. 32-bit application support lives on via the 32-bit compatibility mode in their respective 64-bit platforms. The armv7 platform remains as the last supported 32-bit platform. We thank them for their service."

Now FreeBSD supports five CPU architectures — two Tier-1 platforms, x86-64 and AArch64, and three Tier-2 platforms: armv7, powerpc64le, and riscv64.

Arguably, it's time. AMD's first 64-bit chips started shipping 22 years ago. Intel launched the original x86 chip, the 8086, in 1978. These days, 64-bit is nearly as old as the entire Intel 80x86 platform was when the 64-bit versions first appeared. In comparison, a few months ago, Debian 13 also dropped its x86-32 edition, six years after Canonical launched its first x86-64-only distro, Ubuntu 19.10.

Another significant change is that this is the first version built under the new pkgbase system, although it's still experimental and optional for now. If you opt for a pkgbase installation, then the core OS itself is installed from multiple separate software packages, meaning that the whole system can be updated using the package manager. Over in the Linux world, this is the norm, but Linux is a very different beast... The plan is that by FreeBSD 16, scheduled for December 2027, the restructure will be complete, the old distribution sets will be removed, and the current freebsd-update command and its associated infrastructure can be turned off.

Another significant change is reproducible builds, a milestone the project reached in late October. This change is part of a multi-project initiative toward ensuring deterministic compilation: to be able to demonstrate that a certain set of source files and compilation directives is guaranteed to produce identical binaries, as a countermeasure against compromised code. A handy side-effect is that building the whole OS, including installation media images, no longer needs root access.
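The payoff of deterministic compilation is that verification reduces to comparison: two independent builds of the same sources should be bit-for-bit identical, so matching hashes prove nothing was tampered with along the way. A minimal sketch of that check (the file names are hypothetical):

```python
# Reproducible-builds check, in miniature: hash two independently produced
# build artifacts; if the builds are deterministic, the digests must match,
# and a mismatch flags a compromised or non-reproducible build.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large OS images don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def builds_match(build_a: Path, build_b: Path) -> bool:
    return sha256_of(build_a) == sha256_of(build_b)
```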

There are of course other new features. Lots of drivers and subsystems have been updated, and this release has better power management, including suspend and resume. There's improved wireless networking, with support for more Wi-Fi chipsets and faster wireless standards, plus updated graphics drivers... The release announcement calls out the inclusion of OpenZFS 2.4.0-rc4, OpenSSL 3.5.4, and OpenSSH 10.0p2, and notes the inclusion of some new quantum-resistant encryption systems...

In general, we found FreeBSD 15 easier and less complicated to work with than either of the previous major releases. It should be easier on servers too. The new OCI container support in FreeBSD 14.2, which we wrote about a year ago, is more mature now. FreeBSD has its own version of Podman, and you can run Linux containers on FreeBSD. This means you can use Docker commands and tools, which are familiar to many more developers than FreeBSD's native Jail system.


"FreeBSD has its own place in servers and the public cloud, but it's getting easier to run it as a desktop OS as well," the article concludes. "It can run all the main Linux desktops, including GNOME on Wayland."

"There's no systemd here, and never will be — and no Flatpak or Snap either, for that matter."
Wireless Networking

Why One Man Is Fighting For Our Right To Control Our Garage Door Openers (nytimes.com) 126

An anonymous reader quotes a report from the New York Times: A few years ago, Paul Wieland, a 44-year-old information technology professional living in New York's Adirondack Mountains, was wrapping up a home renovation when he ran into a hiccup. He wanted to be able to control his new garage door with his smartphone. But the options available, including a product called MyQ, required connecting to a company's internet servers. He believed a "smart" garage door should operate only over a local Wi-Fi network to protect a home's privacy, so he started building his own system to plug into his garage door. By 2022, he had developed a prototype, which he named RATGDO, for Rage Against the Garage Door Opener. He had hoped to sell 100 of his new gadgets just to recoup expenses, but he ended up selling tens of thousands. That's because MyQ's maker did what a number of other consumer device manufacturers have done over the last few years, much to the frustration of their customers: It changed the device, making it both less useful and more expensive to operate.

Chamberlain Group, a company that makes garage door openers, had created the MyQ hubs so that virtually any garage door opener could be controlled with home automation software from Apple, Google, Nest and others. Chamberlain also offered a free MyQ smartphone app. Two years ago, Chamberlain started shutting down support for most third-party access to its MyQ servers. The company said it was trying to improve the reliability of its products. But this effectively broke connections that people had set up to work with Apple's Home app or Google's Home app, among others. Chamberlain also started working with partners that charge subscriptions for their services, though a basic app to control garage doors was still free.

While Mr. Wieland said RATGDO sales spiked after Chamberlain made those changes, he believes the popularity of his device is about more than just opening and closing a garage. It stems from widespread frustration with companies that sell internet-connected hardware that they eventually change or use to nickel-and-dime customers with subscription fees. "You should own the hardware, and there is a line there that a lot of companies are experimenting with," Mr. Wieland said in a recent interview. "I'm really afraid for the future that consumers are going to swallow this and that's going to become the norm." [...] For Mr. Wieland, the fight isn't over. He started a company named RATCLOUD, for Rage Against the Cloud. He said he was developing similar products that were not yet for sale.

Microsoft

Microsoft Faces New Complaint For Unlawfully Processing Data On Behalf of Israeli Military (aljazeera.com) 53

Ancient Slashdot user Alain Williams shares a report from Al Jazeera: The Irish Council for Civil Liberties (ICCL) has announced it filed a complaint against Microsoft, accusing the global tech giant of unlawfully processing data on behalf of the Israeli military and facilitating the killings of Palestinian civilians in Gaza. In the complaint, the council asked the Data Protection Commission -- the European Union's lead data regulator for the company -- to "urgently investigate" Microsoft Ireland's processing.

"Microsoft's technology has put millions of Palestinians in danger. These are not abstract data-protection failures -- they are violations that have enabled real-world violence," Joe O'Brien, ICCL's executive director, said in a statement. "When EU infrastructure is used to enable surveillance and targeting, the Irish Data Protection Commission must step in -- and it must use its full powers to hold Microsoft to account."

After months of complaints from rights groups and Microsoft whistleblowers, the company said in September it cancelled some services to the Israeli military over concerns that it was violating Microsoft's terms of service by using cloud computing software to spy on millions of Palestinians.

Microsoft

Microsoft Lowers AI Software Sales Quota As Customers Resist New Products (reuters.com) 32

An anonymous reader quotes a report from Reuters: Multiple divisions at Microsoft have lowered sales growth targets for certain artificial intelligence products after many sales staff missed goals in the fiscal year that ended in June, The Information reported on Wednesday. It is rare for Microsoft to lower quotas for specific products, the report said, citing two salespeople in the Azure cloud unit. The division is closely watched by investors as it is the main beneficiary of Microsoft's AI push. [...]

The Information report said Carlyle Group last year started using Copilot Studio to automate tasks such as meeting summaries and financial models, but cut its spending on the product after flagging Microsoft about its struggles to get the software to reliably pull data from other applications. The report shows the industry was in the early stages of adopting AI, said D.A. Davidson analyst Gil Luria. "That does not mean there isn't promise for AI products to help companies become more productive, just that it may be harder than they thought."

Cloud

Amazon and Google Announce Resilient 'Multicloud' Networking Service Plus an Open API for Interoperability (reuters.com) 21

Their announcement calls it "more than a multicloud solution," saying it's "a step toward a more open cloud environment. The API specifications developed for this product are open for other providers and partners to adopt, as we aim to simplify global connectivity for everyone."

Amazon and Google are introducing "a jointly developed multicloud networking service," reports Reuters. "The initiative will enable customers to establish private, high-speed links between the two companies' computing platforms in minutes instead of weeks." The new service is being unveiled a little over a month after an Amazon Web Services outage on October 20 disrupted thousands of websites worldwide, knocking offline some of the internet's most popular apps, including Snapchat and Reddit. That outage will cost U.S. companies between $500 million and $650 million in losses, according to analytics firm Parametrix.
Google and Amazon are promising "high resiliency" through "quad-redundancy across physically redundant interconnect facilities and routers," with both companies continuously watching for issues. (And they're using MACsec encryption between the Google Cloud and AWS edge routers, according to Sunday's announcement.) From that announcement:

As organizations increasingly adopt multicloud architectures, the need for interoperability between cloud service providers has never been greater. Historically, however, connecting these environments has been a challenge, forcing customers to take a complex "do-it-yourself" approach to managing global multi-layered networks at scale.... Previously, to connect cloud service providers, customers had to manually set up complex networking components, including physical connections and equipment; this approach required lengthy lead times and coordination with multiple internal and external teams, and could take weeks or even months. AWS had a vision for developing this capability as a unified specification that could be adopted by any cloud service provider, and collaborated with Google Cloud to bring it to market.

Now, this new solution reimagines multicloud connectivity by moving away from physical infrastructure management toward a managed, cloud-native experience.

Reuters points out that Salesforce "is among the early users of the new approach, Google Cloud said in a statement."
Businesses

Amazon Tells Its Engineers: Use Our AI Coding Tool 'Kiro' (yahoo.com) 25

"Amazon suggested its engineers eschew AI code generation tools from third-party companies in favor of its own," reports Reuters, "a move to bolster its proprietary Kiro service, which it released in July, according to an internal memo viewed by Reuters." In the memo, posted to Amazon's internal news site, the company said, "While we continue to support existing tools in use today, we do not plan to support additional third-party AI development tools.

"As part of our builder community, you all play a critical role shaping these products and we use your feedback to aggressively improve them," according to the memo.

The guidance would seem to preclude Amazon employees from using other popular software coding tools like OpenAI's Codex, Anthropic's Claude Code, and those from startup Cursor. That is despite Amazon having invested about $8 billion into Anthropic and reaching a seven-year $38 billion deal with OpenAI to sell it cloud-computing services..."To make these experiences truly exceptional, we need your help," according to the memo, which was signed by Peter DeSantis, senior vice president of AWS utility computing, and Dave Treadwell, senior vice president of eCommerce Foundation. "We're making Kiro our recommended AI-native development tool for Amazon...."

In October, Amazon revised its internal guidance for OpenAI's Codex to "Do Not Use" following a roughly six-month assessment, according to a memo reviewed by Reuters. And Claude Code was briefly designated as "Do Not Use," before that was reversed following a reporter inquiry at the time.

The article adds that Amazon "has been fighting a reputation that it is trailing competitors in development of AI tools as rivals like OpenAI and Google speed ahead..."
The Internet

The Battle Over Africa's Great Untapped Resource: IP Addresses (msn.com) 55

In his mid-20s, Lu Heng "got an idea that has made him a lot richer," writes the Wall Street Journal.

He scooped up 10 million unused IP addresses, mostly from Africa, and now leases them to companies, mostly outside Africa, "that need them badly." [A]round half of internet traffic continues to use IPv4, because changing to IPv6 can be expensive and complex and many older devices still need IPv4. Companies including Amazon, Microsoft and Google still want IPv4 addresses because their cloud-hosting businesses need them as bridges between the IPv4 and IPv6 worlds... Africa, which has been slower to develop internet infrastructure than the rest of the world, is the only region that still has some of the older addresses to dole out... He searches for IPv4 addresses that aren't being used — by ISPs or anyone else that holds them — and uses his Hong Kong-based company, Larus, to lease them out to others.

In 2013, Lu registered a new company in the Seychelles, an African archipelago in the Indian Ocean, to apply for IP addresses from Africa's internet registry, called the African Network Information Centre, or Afrinic. Between 2013 and 2016, Afrinic granted that company, Cloud Innovation, 6.2 million IPv4 addresses. That's more addresses than are assigned to Nigeria, Africa's most populous nation. A single IPv4 address can be worth about $50 on its transfer to a company like Larus, which leases it onward for around 5% to 10% of that value annually. Larus and its affiliate companies, Lu said, control just over 10 million IPv4 addresses. The architects of the internet don't appear to have contemplated the possibility that anyone would seek to monetize IP addresses...
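The article's own figures make for a quick back-of-envelope on the leasing economics: an address worth about $50, leased out at 5 to 10% of that value per year, across the roughly 10 million addresses Larus and its affiliates control.

```python
# Back-of-envelope from the article's figures only: per-address lease income
# and the annual total across the whole portfolio. These are the article's
# numbers, not audited financials.
ADDRESS_VALUE_USD = 50
LEASE_RATE_LOW, LEASE_RATE_HIGH = 0.05, 0.10
ADDRESSES = 10_000_000

per_address_low = ADDRESS_VALUE_USD * LEASE_RATE_LOW    # ~$2.50 per year
per_address_high = ADDRESS_VALUE_USD * LEASE_RATE_HIGH  # ~$5.00 per year

annual_low = per_address_low * ADDRESSES    # ~$25 million per year
annual_high = per_address_high * ADDRESSES  # ~$50 million per year
```

In other words, at the article's stated rates the portfolio would gross somewhere between $25 million and $50 million a year in lease income.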

Lu's activities triggered a showdown with Africa's internet registry. In 2020, after what it said was an internal review, Afrinic sent letters to Lu and others seeking to reclaim the IP addresses they held. In Lu's case, Afrinic said he shouldn't be using the addresses outside Africa. Lu responded that he wasn't violating rules in place when he got the addresses... After some back-and-forth, Lu sued Afrinic in Mauritius to keep his allocated addresses, eventually filing dozens of lawsuits... One of the lawsuits that Lu filed in Mauritius prompted a court there to freeze Afrinic's bank accounts in July 2021, effectively paralyzing the organization and eventually sending it into receivership. The receivership choked off distributions of new IPv4 addresses, leaving the continent's service providers struggling to expand capacity...

In September, Afrinic elected a new board. Since then, some internet-service providers have been granted IPv4 addresses.

Cloud

AWS Introduces DNS Failover Feature for Its Notoriously Unreliable US East Region (theregister.com) 25

Amazon Web Services has rolled out a DNS resilience feature that allows customers to make domain name system changes within 60 minutes of a service disruption in its US East region, a direct response to the long history of outages at the cloud giant's most troubled infrastructure.

AWS said customers in regulated industries like banking, fintech and SaaS had asked for additional capabilities to meet business continuity and compliance requirements, specifically the ability to provision standby resources or redirect traffic during unexpected regional disruptions. The 60-minute recovery time objective still leaves a substantial window for outages to cascade, and the timing of the announcement -- less than six weeks after an October 20th DynamoDB incident and a subsequent VM problem drew criticism -- underscores how persistent US East's reliability issues have been.
Microsoft

Seven Years Later, Airbus is Still Trying To Kick Its Microsoft Habit (theregister.com) 92

Breaking free from Microsoft is harder than it looks. Airbus began migrating its 100,000-plus workforce from Office to Google Workspace more than seven years ago and it still hasn't completed the switch. The Register: As we exclusively revealed in March 2018, the aerospace giant told 130,000 employees it was ditching Microsoft's productivity tools for Google's cloud-based alternatives. Then-CEO Tom Enders predicted migration would finish in 18 months, a timeline that, in hindsight, was "extremely ambitious," according to Catherine Jestin, Airbus's executive vice president of digital.

Today, more than two-thirds of Airbus's 150,000 employees have fully transitioned, but significant pockets continue to use Microsoft in parallel. Finance, for example, still relies on Excel because Google Sheets can't handle the necessary file sizes, as some spreadsheets involve 20 million cells. "Some of the limitations was just the number of cells that you could have in one single file. We'll definitely start to remove some of the work," Jestin told The Register.

The Almighty Buck

OpenAI Needs At Least $207 Billion By 2030 Just To Keep Losing Money, HSBC Estimates (ft.com) 83

OpenAI will need to raise at least $207 billion in new funding by 2030 to sustain operations while continuing to lose money, according to a new analysis from HSBC that models the company's cloud computing commitments against projected revenue. The bank's US software team updated its forecasts after OpenAI announced a $250 billion cloud compute rental deal with Microsoft in late October and a $38 billion deal with Amazon days later, bringing total contracted compute capacity to 36 gigawatts.

HSBC projects cumulative rental costs of $792 billion through 2030. Revenue growth remains strong in the model -- the bank expects OpenAI to reach 3 billion users by decade's end, up from roughly 800 million today -- but costs rise in lockstep, meaning OpenAI will still be subsidizing users well into the next decade. If revenue growth disappoints and investors turn cautious, the company's best option might be walking away from some data center commitments.
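Reading HSBC's two headline figures together gives a rough sense of scale. This is illustrative arithmetic only, not the bank's actual model, which nets many line items this subtraction ignores:

```python
# Rough reading of HSBC's headline figures (illustrative only): if compute
# rentals cost $792B through 2030 and OpenAI still needs $207B of new
# funding, the model implicitly expects the remainder to be covered by
# revenue and capital already raised.
RENTAL_COSTS_BN = 792          # cumulative through 2030, per HSBC
NEW_FUNDING_NEEDED_BN = 207    # minimum new funding needed, per HSBC

implied_coverage_bn = RENTAL_COSTS_BN - NEW_FUNDING_NEEDED_BN
```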
Mars

NASA Rover Makes a Shocking Discovery: Lightning on Mars (nytimes.com) 15

An anonymous reader shares a report: It is shocking but not surprising. Lightning crackles on Mars, scientists reported on Wednesday. What they observed, however, were not jagged, high-voltage bolts like those on Earth, arcing thousands of feet from cloud to ground. Rather, the phenomenon was more like the shock you feel when you scuff your feet on the carpet on a cold winter morning and then touch a metal doorknob.

"This is like mini-lightning on Mars," Baptiste Chide, a scientist at the Research Institute in Astrophysics and Planetary Science in Toulouse, France, said of the centimeter-scale electrical discharges. Dr. Chide and his colleagues reported the findings in a paper published on Wednesday in the journal Nature. The electrical sparks, although not as dramatically violent as on Earth, could play an important role in chemical reactions in the Martian atmosphere.

AI

Nvidia Claims 'Generation Ahead' Advantage After $200 Billion Sell-off on Google Fears 33

Nvidia pushed back against investor concerns about Google's competitive positioning in AI on Tuesday after the chipmaker's shares tumbled 4.4% and erased nearly $200 billion in market cap on fears that Alphabet's tensor processing units were gaining ground against its dominance in AI computing. The company said it was "delighted by Google's success" but asserted that it continues to supply chips to Google.

Nvidia said it remains "a generation ahead of the industry" as the only platform that runs every AI model and operates everywhere computing is done. The statement came after investors reacted to the release of Google's Gemini 3 large language model last week. The model was trained using TPUs rather than Nvidia chips. A report in The Information on Monday said Google was pitching potential clients including Meta on using TPUs in their data centers rather than Nvidia's chips.

Nvidia said its platform offers "greater performance, versatility, and fungibility than ASICs," referring to application-specific integrated circuits like Google's TPUs that are designed for specific AI frameworks or functions. Google's TPUs have until now only been available for customers to rent through its cloud computing service. Nvidia has lost more than $800 billion in market value since it peaked above $5 trillion less than a month ago.
Government

Trump Launches Genesis Mission, a Manhattan Project-Level AI Push (nerds.xyz) 102

BrianFagioli writes: President Trump has issued a sweeping executive order that creates the Genesis Mission, a national AI program he compares to a Manhattan Project-level effort. It centralizes DOE supercomputers, national lab resources, massive scientific datasets, and new AI foundation models into a single platform meant to fast-track research in areas like fusion, biotech, microelectronics, and advanced manufacturing. The order positions AI as both a scientific accelerator and a national security requirement, with heavy emphasis on data access, secure cloud environments, classification controls, and export restrictions.

The mission also sets strict timelines for identifying key national science challenges, integrating interagency datasets, enabling AI-run experimentation, and creating public-private research partnerships. Whether this becomes an effective scientific engine or another oversized federal program remains to be seen, but the administration is clearly pushing to frame Trump as the president who put AI at the center of U.S. research strategy.

AI

Amazon Pledges Up To $50 Billion To Expand AI, Supercomputing For US Government 15

Amazon is committing up to $50 billion to massively expand AI and supercomputing capacity for U.S. government cloud regions, adding 1.3 gigawatts of high-performance compute and giving federal agencies access to its full suite of AI tools. Reuters reports: The project, expected to break ground in 2026, will add nearly 1.3 gigawatts of artificial intelligence and high-performance computing capacity across AWS Top Secret, AWS Secret and AWS GovCloud regions by building data centers equipped with advanced compute and networking technologies.

Under the latest initiative, federal agencies will gain access to AWS' comprehensive suite of AI services, including Amazon SageMaker for model training and customization, Amazon Bedrock for deploying models and agents, as well as foundation models such as Amazon Nova and Anthropic Claude. The federal government seeks to develop tailored AI solutions and drive cost-savings by leveraging AWS' dedicated and expanded capacity.
Hardware

Arduino's New Terms of Service Worries Hobbyists Ahead of Qualcomm Acquisition (arstechnica.com) 45

An anonymous reader quotes a report from Ars Technica: Some members of the maker community are distraught about Arduino's new terms of service (ToS), saying that the added rules put the company's open source DNA at risk. Arduino updated its ToS and privacy policy this month, which is about a month after Qualcomm announced that it's acquiring the open source hardware and software company. Among the most controversial changes is this addition: "User shall not: translate, decompile or reverse-engineer the Platform, or engage in any other activity designed to identify the algorithms and logic of the Platform's operation, unless expressly allowed by Arduino or by applicable license agreements ..."

In response to concerns from some members of the maker community, including from open source hardware distributor and manufacturer Adafruit, Arduino posted a blog on Friday. Regarding the new reverse-engineering rule, Arduino's blog said: "Any hardware, software or services (e.g. Arduino IDE, hardware schematics, tooling and libraries) released with Open Source licenses remain available as before. Restrictions on reverse-engineering apply specifically to our Software-as-a-Service cloud applications. Anything that was open, stays open."

But Adafruit founder and engineer Limor Fried and Adafruit managing editor Phillip Torrone are not convinced. They told Ars Technica that Arduino's blog leaves many questions unanswered and said that they've sent these questions to Arduino without response. "Why is reverse-engineering prohibited at all for a company built on openly hackable systems?" Fried and Torrone asked in a shared statement.
There are also concerns about the ToS' broad new AI-monitoring powers, which offer little clarity on what data is collected, who can access it, or how long it's retained. On top of that, the update introduces an unusual patent clause that bars users from using the platform to identify potential infringement by Arduino or its partners, along with sweeping, perpetual rights over user-generated content. This could allow Arduino, and potentially Qualcomm, to republish, modify, monetize, or redistribute user uploads indefinitely.
Google

NATO Taps Google For Air-Gapped Sovereign Cloud (theregister.com) 14

NATO has hired Google to provide "air-gapped" sovereign cloud services and AI in "completely disconnected, highly secure environments." From a report: The Chocolate Factory will support the military alliance's Joint Analysis, Training, and Education Centre (JATEC) in a move designed to improve its digital infrastructure and strengthen its data governance. NATO was formed in 1949 after Belgium, Canada, Denmark, France, Iceland, Italy, Luxembourg, the Netherlands, Norway, Portugal, the United Kingdom, and the United States signed the North Atlantic Treaty. Since then, 20 more European countries have joined, most recently Finland and Sweden. US President Donald Trump has criticized fellow members' financial contribution to the alliance and at times cast doubt over how likely the US is to defend its NATO allies.

In an announcement this week, Google Cloud said the "significant, multimillion-dollar contract" with the NATO Communication and Information Agency (NCIA) would offer highly secure, sovereign cloud capabilities. The agreement promises NATO "uncompromised data residency and operational controls, providing the highest degree of security and autonomy, regardless of scale or complexity," the statement said.

Google

How Google Finally Leapfrogged Rivals With New Gemini Rollout (msn.com) 38

An anonymous reader shares a report: With the release of its third version last week, Google's Gemini large language model surged past ChatGPT and other competitors to become the most capable AI chatbot, as determined by consensus industry-benchmark tests. [...] Aaron Levie, chief executive of the cloud content management company Box, got early access to Gemini 3 several days ahead of the launch. The company ran its own evaluations of the model over the weekend to see how well it could analyze large sets of complex documents. "At first we kind of had to squint and be like, 'OK, did we do something wrong in our eval?' because the jump was so big," he said. "But every time we tested it, it came out double-digit points ahead."

[...] Google has been scrambling to get an edge in the AI race since the launch of ChatGPT three years ago, which stoked fears among investors that the company's iconic search engine would lose significant traffic to chatbots. The company struggled for months to get traction. Chief Executive Sundar Pichai and other executives have since worked to overhaul the company's AI development strategy by breaking down internal silos, streamlining leadership and consolidating work on its models, employees say. Sergey Brin, one of Google's co-founders, resumed a day-to-day role at the company helping to oversee its AI-development efforts.

Programming

Microsoft and GitHub Preview New Tool That Identifies, Prioritizes, and Fixes Vulnerabilities With AI (thenewstack.io) 18

"Security, development, and AI now move as one," says Microsoft's director of cloud/AI security product marketing.

Microsoft and GitHub "have launched a native integration between Microsoft Defender for Cloud and GitHub Advanced Security that aims to address what one executive calls decades of accumulated security debt in enterprise codebases..." according to The New Stack: The integration, announced this week in San Francisco at the Microsoft Ignite 2025 conference and now available in public preview, connects runtime intelligence from production environments directly into developer workflows. The goal is to help organizations prioritize which vulnerabilities actually matter and use AI to fix them faster. "Throughout my career, I've seen vulnerability trends going up into the right. It didn't matter how good of a detection engine and how accurate our detection engine was, people just couldn't fix things fast enough," said Marcelo Oliveira, VP of product management at GitHub, who has spent nearly a decade in application security. "That basically resulted in decades of accumulation of security debt into enterprise code bases." According to industry data, critical and high-severity vulnerabilities constitute 17.4% of security backlogs, with a mean time to remediation of 116 days, said Andrew Flick, senior director of developer services, languages and tools at Microsoft, in a blog post. Meanwhile, applications face attacks as frequently as once every three minutes, Oliveira said.

The integration represents the first native link between runtime intelligence and developer workflows, said Elif Algedik, director of product marketing for cloud and AI security at Microsoft, in a blog post... The problem, according to Flick, comes down to three challenges: security teams drowning in alert fatigue while AI rapidly introduces new threat vectors that they have little time to understand; developers lacking clear prioritization while remediation takes too long; and both teams relying on separate, nonintegrated tools that make collaboration slow and frustrating... The new integration works bidirectionally. When Defender for Cloud detects a vulnerability in a running workload, that runtime context flows into GitHub, showing developers whether the vulnerability is internet-facing, handling sensitive data or actually exposed in production. This is powered by what GitHub calls the Virtual Registry, which creates code-to-runtime mapping, Flick said...

In the past, such an alert would age in a dashboard while developers worked on unrelated fixes, not knowing it was the critical one, he said. Now, a security campaign can be created in GitHub that filters for runtime risk, such as internet exposure or sensitive data, and notifies the developer to prioritize the issue.
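The prioritization idea the article describes, ranking findings by runtime exposure rather than severity alone, can be sketched generically. This is a hypothetical illustration, not GitHub's or Microsoft's actual API; the field names and weights are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    id: str
    severity: int          # 1 (low) .. 4 (critical)
    internet_facing: bool  # runtime context observed in the production workload
    sensitive_data: bool
    exposed_in_prod: bool

def runtime_priority(f: Finding) -> int:
    """Score a finding: base severity, boosted by runtime exposure signals."""
    score = f.severity * 10
    score += 15 if f.internet_facing else 0
    score += 10 if f.sensitive_data else 0
    score += 20 if f.exposed_in_prod else 0
    return score

findings = [
    Finding("CVE-A", 4, False, False, False),  # critical, but dormant
    Finding("CVE-B", 2, True, True, True),     # medium, but live and exposed
]
ranked = sorted(findings, key=runtime_priority, reverse=True)
print([f.id for f in ranked])  # a live medium-severity issue outranks a dormant critical
```

The point of weighting runtime context this heavily is exactly the scenario in the story: a medium-severity vulnerability that is internet-facing and handling sensitive data in production can surface ahead of a critical one that never runs.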

GitHub Copilot "now automatically checks dependencies, scans for first-party code vulnerabilities and catches hardcoded secrets before code reaches developers," the article points out — but GitHub's VP of product management says this takes things even further.

"We're not only helping you fix existing vulnerabilities, we're also reducing the number of vulnerabilities that come into the system when the level of throughput of new code being created is increasing dramatically with all these agentic coding agent platforms."
