AMD

AMD Unveils Radeon RX 6000M Mobile GPUs For New Breed of All-AMD Gaming Laptops (hothardware.com) 15

MojoKid writes: AMD just took the wraps off its new line of Radeon RX 6000M GPUs for gaming laptops. Combined with its Ryzen 5000 series processors, the company claims all-AMD powered "AMD Advantage" machines will deliver new levels of performance, visual fidelity and value for gamers. AMD unveiled three new mobile GPUs. Sitting at the top is the Radeon RX 6800M, featuring 40 compute units, 40 ray accelerators, a 2,300MHz game clock and 12GB of GDDR6 memory. According to AMD, its flagship Radeon RX 6800M mobile GPU can deliver 120 frames per second at 1440p with a blend of raytracing, compute, and traditional effects.

Next, the new Radeon RX 6700M sports 36 compute units, 36 ray accelerators, a 2,300MHz game clock and 10GB of GDDR6 memory. Finally, the Radeon RX 6600M comes armed with 28 compute units and 28 ray accelerators, a 2,177MHz game clock and 8GB of GDDR6 memory. HotHardware has a deep dive review of a new ASUS ROG Strix G15 gaming laptop with the Radeon RX 6800M on board, as well as an 8-core Ryzen 9 5900HX processor. In the benchmarks, the Radeon RX 6800M-equipped machine puts up numbers that rival GeForce RTX 3070 and 3080 laptop GPUs in traditional rasterized game engines, though it trails a bit in ray tracing enhanced gaming. You can expect this new breed of all-AMD laptops to arrive on the market later this month.

Businesses

Instacart Bets on Robots To Shrink Ranks of 500,000 Gig Shoppers (bloomberg.com) 43

Instacart has an audacious plan to replace its army of gig shoppers with robots -- part of a long-term strategy to cut costs and put its relationship with supermarket chains on a sustainable footing. From a report: The plan, detailed in documents reviewed by Bloomberg, involves building automated fulfillment centers around the U.S., where hundreds of robots would fetch boxes of cereal and cans of soup while humans gather produce and deli products. Some facilities would be attached to existing grocery stores while larger standalone centers would process orders for several locations, according to the documents, which were dated July and December.

Despite working on the strategy for more than a year, however, the company has yet to sign up a single supermarket chain. Instacart had planned to begin testing the fulfillment centers later this year, the documents show. But the company has fallen behind schedule, according to people familiar with the situation. And though the documents mention asking several automation providers to build the technology, Instacart hasn't settled on any, said the people, who requested anonymity to discuss a private matter. In February, the Financial Times reported on elements of the strategy and said Instacart in early 2020 sent out requests for proposals to five robotics companies.

An Instacart spokeswoman said the company was busy buttressing its operations during the pandemic, when it signed up 300,000 new gig workers in a matter of weeks, bringing the current total to more than 500,000. But the delays in getting the automation strategy off the ground could potentially undermine plans to go public this year. Investors know robots will play a critical role in modernizing the $1.4 trillion U.S. grocery industry.

Hardware

The GeForce RTX 3080 Ti is Nvidia's 'New Gaming Flagship' (pcworld.com) 60

Nvidia officially announced the long-awaited GeForce RTX 3080 Ti during its Computex keynote late Monday night, and this $1,200 graphics card looks like an utter beast. The $600 GeForce RTX 3070 Ti also made its debut with faster GDDR6X memory. From a report: All eyes are on the RTX 3080 Ti, though. Nvidia dubbed it GeForce's "new gaming flagship" as the $1,500 RTX 3090 is built for work and play alike, but the new GPU is a 3090 in all but name (and memory capacity). While Nvidia didn't go into deep technical details during the keynote, the GeForce RTX 3080 Ti's specifications page shows it packing a whopping 10,240 CUDA cores -- just a couple hundred fewer than the 3090's 10,496, but massively more than the 8,704 found in the vanilla 3080.

Expect this card to chew through games on par with the best, especially titles that support real-time ray tracing and Nvidia's amazing DLSS feature. The memory system can handle the ride, as it's built on the RTX 3090's upgraded bones. The GeForce RTX 3080 Ti comes with a comfortable 12GB of blazing-fast GDDR6X memory over a wide 384-bit bus. That's half the ludicrous 24GB capacity found in the 3090, but more than enough to handle any gaming workload you throw at it. The same can't be said of the vanilla RTX 3080, which comes with 10GB of GDDR6X over a narrower bus; rare titles (like Doom Eternal) can already use more than 10GB of memory when you're playing at 4K resolution with the eye candy cranked to the max. The extra two gigs make the RTX 3080 Ti feel much more future-proof.

Data Storage

Seagate 'Exploring' Possible New Line of Crypto-Specific Hard Drives (techradar.com) 47

In a Q&A with TechRadar, storage hardware giant Seagate revealed it is keeping a close eye on the crypto space, with a view to potentially launching a new line of purpose-built drives. From the report: Asked whether companies might develop storage products specifically for cryptocurrency use cases, Jason M. Feist, who heads up Seagate's emerging products arm, said it was a "possibility." Feist said he could offer no concrete information at this stage, but did suggest the company is "exploring this opportunity and imagines others may be as well."
Intel

Intel's latest 11th Gen Processor Brings 5.0GHz Speeds To Thin and Light Laptops (theverge.com) 51

Intel made a splash earlier in May with the launch of its first 11th Gen Tiger Lake H-series processors for more powerful laptops, but at Computex 2021, the company is also announcing a pair of new U-series chips -- one of which marks the first 5.0GHz clock speed for the company's U-series lineup of lower voltage chips. From a report: Specifically, Intel is announcing the Core i7-1195G7 -- its new top of the line chip in the U-series range -- and the Core i5-1155G7, which takes the crown of Intel's most powerful Core i5-level chip, too. Like the original 11th Gen U-series chips, the new chips operate in the 12W to 28W range. Both new chips are four core / eight thread configurations, and feature Intel's Iris Xe integrated graphics (the Core i7-1195G7 comes with 96 EUs, while the Core i5-1155G7 has 80 EUs.)

The Core i7-1195G7 features a base clock speed of 2.9GHz, but cranks up to a 5.0GHz maximum single core speed using Intel's Turbo Boost Max 3.0 technology. The Core i5-1155G7, on the other hand, has a base clock speed of 2.5GHz and a boosted speed of 4.5GHz. Getting to 5GHz out of the box is a fairly recent development for laptop CPUs, period: Intel's first laptop processor to cross the 5GHz mark arrived in 2019.

Supercomputing

World's Fastest AI Supercomputer Built from 6,159 NVIDIA A100 Tensor Core GPUs (nvidia.com) 57

Slashdot reader 4wdloop shared this report from NVIDIA's blog, joking that maybe this is where all NVIDIA's chips are going: It will help piece together a 3D map of the universe, probe subatomic interactions for green energy sources and much more. Perlmutter, officially dedicated Thursday at the National Energy Research Scientific Computing Center (NERSC), is a supercomputer that will deliver nearly four exaflops of AI performance for more than 7,000 researchers. That makes Perlmutter the fastest system on the planet on the 16- and 32-bit mixed-precision math AI uses. And that performance doesn't even include a second phase coming later this year to the system based at Lawrence Berkeley National Lab.

More than two dozen applications are getting ready to be among the first to ride the 6,159 NVIDIA A100 Tensor Core GPUs in Perlmutter, the largest A100-powered system in the world. They aim to advance science in astrophysics, climate science and more. In one project, the supercomputer will help assemble the largest 3D map of the visible universe to date. It will process data from the Dark Energy Spectroscopic Instrument (DESI), a kind of cosmic camera that can capture as many as 5,000 galaxies in a single exposure. Researchers need the speed of Perlmutter's GPUs to capture dozens of exposures from one night to know where to point DESI the next night. Preparing a year's worth of the data for publication would take weeks or months on prior systems, but Perlmutter should help them accomplish the task in as little as a few days.

"I'm really happy with the 20x speedups we've gotten on GPUs in our preparatory work," said Rollin Thomas, a data architect at NERSC who's helping researchers get their code ready for Perlmutter. DESI's map aims to shed light on dark energy, the mysterious physics behind the accelerating expansion of the universe.

A similar spirit fuels many projects that will run on NERSC's new supercomputer. For example, work in materials science aims to discover atomic interactions that could point the way to better batteries and biofuels. Traditional supercomputers can barely handle the math required to generate simulations of a few atoms over a few nanoseconds with programs such as Quantum Espresso. But by combining their highly accurate simulations with machine learning, scientists can study more atoms over longer stretches of time. "In the past it was impossible to do fully atomistic simulations of big systems like battery interfaces, but now scientists plan to use Perlmutter to do just that," said Brandon Cook, an applications performance specialist at NERSC who's helping researchers launch such projects. That's where Tensor Cores in the A100 play a unique role. They accelerate both the double-precision floating point math for simulations and the mixed-precision calculations required for deep learning.

Graphics

Resale Prices Triple for NVIDIA Chips as Gamers Compete with Bitcoin Miners (yahoo.com) 108

"In the niche world of customers for high-end semiconductors, a bitter feud is pitting bitcoin miners against hardcore gamers," reports Quartz: At issue is the latest line of NVIDIA graphics cards — powerful, cutting-edge chips with the computational might to display the most advanced video game graphics on the market. Gamers want the chips so they can experience ultra-realistic lighting effects in their favorite games. But they can't get their hands on NVIDIA cards, because miners are buying them up and adapting them to crunch cryptographic codes and harvest digital currency. The fierce competition to buy chips — combined with a global semiconductor shortage — has driven resale prices up as much as 300%, and led hundreds of thousands of desperate consumers to sign up for daily raffles for the right to buy chips at a significant mark-up.

To broker a peace between its warring customers, NVIDIA is, essentially, splitting its cutting-edge graphics chips into two dumbed-down products: GeForce for gamers and the Cryptocurrency Mining Processor (CMP) for miners. GeForce is the latest NVIDIA graphics card — except key parts of it have been slowed down to make it less valuable for miners racing to solve crypto puzzles. CMP is based on a slightly older version of NVIDIA's graphics card which has been stripped of all of its display outputs, so gamers can't use it to render graphics.

NVIDIA's goal in splitting its product offerings is to incentivize miners to buy only CMP chips and leave the GeForce chips for the gamers. "What we hope is that the CMPs will satisfy the miners...[and] steer our GeForce supply to gamers," said CEO Jensen Huang on a May 26 conference call with investors and analysts... It won't be easy to keep the miners at bay, however. NVIDIA tried releasing slowed-down graphics chips in February in an effort to deter miners from buying them, but it didn't work. The miners quickly figured out how to hack the chips and make them perform at full speed again.

Power

Is Natural Gas (Mostly) Good for Global Warming? (ieee.org) 139

Natural gas "creates less carbon emissions than the coal it replaces, but we have to find ways to minimize the leakage of methane."

That's the opinion of Vaclav Smil, a distinguished professor emeritus at the University of Manitoba and a Fellow of the Royal Society of Canada, writing in IEEE's Spectrum (in an article shared by Slashdot reader schwit1): Natural gas is abundant, low-cost, convenient, and reliably transported, with low emissions and high combustion efficiency. Natural-gas-fired heating furnaces have maximum efficiencies of 95 to 97 percent, and combined-cycle gas turbines now achieve overall efficiency slightly in excess of 60 percent. Of course, burning gas generates carbon dioxide, but the ratio of energy to carbon is excellent: Burning a gigajoule of natural gas produces 56 kilograms of carbon dioxide, about 40 percent less than the 95 kg emitted by bituminous coal.
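As a quick check, the "about 40 percent less" claim follows directly from the two emission factors quoted above:

```python
# Emission factors quoted in the article, in kg of CO2 per gigajoule of fuel energy.
CO2_PER_GJ_NATURAL_GAS = 56
CO2_PER_GJ_BITUMINOUS_COAL = 95

# Relative reduction from replacing a gigajoule of coal with a gigajoule of gas.
reduction = (CO2_PER_GJ_BITUMINOUS_COAL - CO2_PER_GJ_NATURAL_GAS) / CO2_PER_GJ_BITUMINOUS_COAL
print(f"{reduction:.0%}")  # about 41%, i.e. roughly the "40 percent less" cited
```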

This makes gas the obvious replacement for coal. In the United States, this transition has been unfolding for two decades. Gas-fueled capacity increased by 192 gigawatts from 2000 to 2005 and by an additional 69 GW from 2006 through the end of 2020. Meanwhile, the 82 GW of coal-fired capacity that U.S. utilities removed from 2012 to 2020 is projected to be augmented by another 34 GW by 2030, totaling 116 GW — more than a third of the former peak rating.

So far, so green. But methane is itself a very potent greenhouse gas, packing from 84 to 87 times as much global warming potential as an equal quantity of carbon dioxide when measured over 20 years (and 28 to 36 times as much over 100 years). And some of it leaks out. In 2018, a study of the U.S. oil and natural-gas supply chain found that those emissions were about 60 percent higher than the Environmental Protection Agency had estimated. Such fugitive emissions, as they are called, are thought to be equivalent to 2.3 percent of gross U.S. gas production...

Without doubt, methane leakages during extraction, processing, and transportation do diminish the overall beneficial impact of using more natural gas, but they do not erase it, and they can be substantially reduced.
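As a rough, illustrative sketch of that last point (the methane heating value of 50 MJ/kg and the mid-range 100-year GWP of 30 are assumptions chosen for the example, not figures from the article), one can estimate what a 2.3 percent leak adds to each gigajoule of delivered gas:

```python
# Back-of-envelope estimate, NOT from the article: combine the article's numbers
# with an assumed methane heating value to see how leakage changes the picture.
HEATING_VALUE_MJ_PER_KG = 50   # assumption: roughly methane's lower heating value
CO2_GAS = 56                   # kg CO2 per GJ of gas burned (article)
CO2_COAL = 95                  # kg CO2 per GJ, bituminous coal (article)
LEAK_RATE = 0.023              # fugitive share of gross production (article)
GWP100 = 30                    # 100-year GWP, mid-range of the article's 28-36

burned_kg_per_gj = 1000 / HEATING_VALUE_MJ_PER_KG           # ~20 kg CH4 per GJ
leaked_kg = burned_kg_per_gj * LEAK_RATE / (1 - LEAK_RATE)  # leaked per GJ delivered
effective_co2e = CO2_GAS + leaked_kg * GWP100
print(f"{effective_co2e:.0f} kg CO2e/GJ vs {CO2_COAL} for coal")
```

On this 100-year accounting, leakage raises gas from 56 to roughly 70 kg CO2-equivalent per gigajoule: a real penalty, but still well below coal's 95 kg, which is the article's point. Over the shorter 20-year horizon, with methane's much higher warming potential, the penalty is correspondingly larger.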

China

China's 'Artificial Sun' Fusion Reactor Just Set a New World Record (scmp.com) 90

The South China Morning Post reports that China "has reached another milestone in its quest for a fusion reactor, with one of its 'artificial suns' sustaining extreme temperatures for several times longer than its previous benchmark, according to state media." State news agency Xinhua reported that the Experimental Advanced Superconducting Tokamak in a facility in the eastern city of Hefei registered a plasma temperature of 120 million degrees Celsius for 101 seconds on Friday. It also maintained a temperature of 160 million degrees Celsius for 20 seconds, the report said...

The facilities are part of China's quest for fusion reactors, which hold out hope of unlimited clean energy. But there are many challenges to overcome in what has already been a decades-long quest for the world's scientists. Similar endeavours are under way in the United States, Europe, Russia and South Korea. China is also among 35 countries involved in the International Thermonuclear Experimental Reactor (ITER) megaproject in France...

Despite the progress made, fusion reactors are still a long way from reality. Song Yuntao, director of the Institute of Plasma Physics of the Chinese Academy of Sciences, said the latest results were a major achievement for physics and engineering in China. "The experiment's success lays the foundation for China to build its own nuclear fusion energy station," Song was quoted as saying.

NASA notes that the core of the Sun is only about 15 million degrees Celsius.

So for 20 seconds China's fusion reactor ran at more than 10 times the temperature of the Sun's core (and at eight times that temperature for 101 seconds).
Australia

Robots and AI Will Guide Australia's First Fully Automated Farm (abc.net.au) 41

"Robots and artificial intelligence will replace workers on Australia's first fully automated farm," reports Australia's national public broadcaster ABC.

The total cost of the farm's upgrade? $20 million. Charles Sturt University in Wagga Wagga will create the "hands-free farm" on a 1,900-hectare property to demonstrate what robots and artificial intelligence can do without workers in the paddock... The farm will use robotic tractors, harvesters, survey equipment and drones; artificial intelligence to handle sowing, dressing and harvesting; new sensors to measure plants, soils and animals; and carbon management tools to minimise the carbon footprint.

The farm is already operated commercially and grows a range of broadacre crops, including wheat, canola, and barley, as well as a vineyard, cattle and sheep.

Power

Could Zinc Batteries Replace Lithium-Ion Batteries on the Power Grid? (sciencemag.org) 120

Slashdot reader sciencehabit shares Science magazine's look at efforts to transform zinc batteries "from small, throwaway cells often used in hearing aids into rechargeable behemoths that could be attached to the power grid, storing solar or wind power for nighttime or when the wind is calm." With startups proliferating and lab studies coming thick and fast, "Zinc batteries are a very hot field," says Chunsheng Wang, a battery expert at the University of Maryland, College Park. Lithium-ion batteries — giant versions of those found in electric vehicles — are the current front-runners for storing renewable energy, but their components can be expensive. Zinc batteries are easier on the wallet and the planet — and lab experiments are now pointing to ways around their primary drawback: They can't be recharged over and over for decades.

For power storage, "Lithium-ion is the 800-pound gorilla," says Michael Burz, CEO of EnZinc, a zinc battery startup. But lithium, a relatively rare metal that's only mined in a handful of countries, is too scarce and expensive to back up the world's utility grids. (It's also in demand from automakers for electric vehicles.) Lithium-ion batteries also typically use a flammable liquid electrolyte. That means megawatt-scale batteries must have pricey cooling and fire-suppression technology. "We need an alternative to lithium," says Debra Rolison, who heads advanced electrochemical materials research at the Naval Research Laboratory. Enter zinc, a silvery, nontoxic, cheap, abundant metal. Nonrechargeable zinc batteries have been on the market for decades. More recently, some zinc rechargeables have also been commercialized, but they tend to have limited energy storage capacity. Another technology — zinc flow cell batteries — is also making strides. But it requires more complex valves, pumps, and tanks to operate. So, researchers are now working to improve another variety, zinc-air cells...

Advances are injecting new hope that rechargeable zinc-air batteries will one day be able to take on lithium. Because of the low cost of their materials, grid-scale zinc-air batteries could cost $100 per kilowatt-hour, less than half the cost of today's cheapest lithium-ion versions. "There is a lot of promise here," Burz says. But researchers still need to scale up their production from small button cells and cellphone-size pouches to shipping container-size systems, all while maintaining their performance, a process that will likely take years.

Hardware

Apps Reportedly Limited To Maximum of 5GB RAM In iPadOS, Even With 16GB M1 iPad Pro (macrumors.com) 159

Despite Apple offering the M1 iPad Pro in configurations with 8GB and 16GB of RAM, developers are now indicating that apps are limited to just 5GB of RAM usage, regardless of the configuration the app is running on. MacRumors reports: The M1 iPad Pro comes in two memory configurations; the 128GB, 256GB, and 512GB models feature 8GB of RAM, while the 1TB and 2TB variants offer 16GB of memory, the highest ever in an iPad. Even with the unprecedented amount of RAM on the iPad, developers are reportedly severely limited in the amount they can actually use. According to a post on the Procreate Forum by the developer behind the graphics and design app Artstudio Pro, apps can only use 5GB of RAM on the new M1 iPad Pros, and attempting to use any more will cause the app to crash: "There is a big problem with M1 iPad Pro. After making stress tests and other tests on the new M1 iPad Pro with 16GB of RAM, it turned out that an app can use ONLY 5GB of RAM! If we allocate more, the app crashes. It is only 0.5GB more than in old iPads with 6GB of RAM! I suppose it isn't better on the iPad with 8GB." Following the release of its M1-optimized app, Procreate also noted on Twitter that with either 8GB or 16GB of available RAM, the app is limited in the amount of RAM it can use.
Hardware

More People Are Buying Wearables Than Ever Before (arstechnica.com) 76

An anonymous reader quotes a report from Ars Technica: The wearables category of consumer devices -- which includes smartwatches, fitness trackers, and augmented reality glasses -- shipped more than 100 million units in the first quarter for the first time, according to research firm IDC. Q1 2021 saw a 34.4 percent increase in sales over the same quarter in 2020. To be clear: wearables have sold that many (and more) units in a quarter before, but never in the first quarter, which tends to be a slow period following a spree of holiday-related buying in Q4.

According to IDC's data, Apple leads the market by a significant margin, presumably thanks to the Apple Watch. In Q1 2021, Apple had a market share of 28.8 percent. Samsung sat in a distant second at 11.3 percent, followed by Xiaomi at 9.7 percent and Huawei at 8.2 percent. From there, it's a steep drop to the smaller players -- like BoAt, which has a market share of just 2.9 percent. However, analysts say upstarts or smaller companies like BoAt are driving the significant year-over-year growth for wearables. IDC's report says that the fastest growth comes from form factors besides smartwatches, such as digitally connected rings, audio glasses, and wearable patches. This grab-bag subcategory within wearables, which the IDC simply classifies as "other," actually grew 55 percent year-over-year.

Power

USB-C Power Upgrade Delivers a Whopping 240W for Gaming Laptops and Other Devices (cnet.com) 110

AmiMoJo writes: The USB-C standard will let you plug in power-hungry devices like gaming laptops, docking stations, 4K monitors and printers with an upgrade that accommodates up to 240 watts starting this year. The jump in maximum power is more than double today's 100-watt top capacity. The USB Implementers Forum, the industry group that develops the technology, revealed the new power levels in the version 2.1 update to its USB Type-C specification on Tuesday. The new 240-watt option is called Extended Power Range, or EPR. "We expect devices supporting higher wattages in the second half of 2021," USB-IF said in a statement.

USB began as a useful but limited port for plugging keyboards, mice and printers into PCs. It later swept aside FireWire and other ports as faster speeds let it tackle more demanding tasks. It proved useful for charging phones as the mobile revolution began, paving the way for its use delivering power, not just data. The 240W Extended Power Range option means USB likely will expand its turf yet again. Cables supporting 240 watts will have additional requirements to accommodate the new levels. And USB-IF will require the cables to bear specific icons "so that end users will be able to confirm visually that the cable supports up to...240W," USB-IF said in the specification document.
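The arithmetic behind the headline number is simple: Extended Power Range raises the voltage rather than the current. As a sketch (the specific 28V/36V/48V fixed levels come from the USB PD 3.1 specification, not the article above):

```python
# EPR keeps the existing 5A cable maximum and adds higher fixed voltage rails.
# Assumption (USB PD 3.1 spec, not the article): the new levels are 28V, 36V, 48V.
EPR_VOLTAGES = [28, 36, 48]   # volts
MAX_CURRENT = 5               # amps, unchanged from the 100W (20V x 5A) era

for v in EPR_VOLTAGES:
    print(f"{v} V x {MAX_CURRENT} A = {v * MAX_CURRENT} W")
# The 48V rail is what yields the new 240W Extended Power Range ceiling.
```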

Power

Joe Biden Opens Up California Coast To Offshore Wind (theverge.com) 232

An anonymous reader quotes a report from The Verge: Offshore wind is headed west. The Biden administration announced today that it will open up parts of the Pacific coast to commercial-scale offshore renewable energy development for the first time. The geography of the West Coast poses huge technical challenges for wind energy. But rising to meet those challenges is a big opportunity for both President Joe Biden and California Governor Gavin Newsom to meet their clean energy goals. There are two areas now slotted for development off the coast of Central and Northern California -- one at Morro Bay and another near Humboldt County. Together, these areas could generate up to 4.6GW of power, enough for 1.6 million homes, over the next decade, according to a White House fact sheet.

Compared to the East Coast, waters off the West Coast get deeper much faster. That has stymied offshore wind development. So the White House says it's looking into deploying pretty futuristic technology there: floating wind farms. Until now, technical constraints have generally prevented companies from installing turbines that are fixed to the seafloor in waters more than 60 meters deep. That's left nearly 60 percent of offshore wind resources out of reach, according to the National Renewable Energy Laboratory (NREL). With the development of new technologies that could let wind turbines float in deeper waters, it looks like those resources might finally be within reach.

The Department of Energy says that it has funneled more than $100 million into moving floating offshore wind technology forward. There are only a handful of floating turbines in operation today, and no commercial-scale floating wind farms yet anywhere in the world. The Bureau of Ocean Energy Management still needs to officially designate the areas off the California coast as Wind Energy Areas for development and complete an environmental analysis. The plan is to auction off leases for the area to developers in mid-2022. It's also working with the Department of Defense to make sure the projects don't interfere with its ongoing "testing, training, and operations" off California's coast.

Bitcoin

Bitcoin Mining Council To Report Renewable Energy Usage (bbc.com) 183

A new Bitcoin Mining Council has been created to improve the crypto-currency's sustainability, following a meeting of "leading" Bitcoin miners and Elon Musk. The BBC reports: It's hoped the council will "promote energy usage transparency" and encourage miners to use renewable sources. According to a tweet by MicroStrategy CEO Michael Saylor, who convened the meeting of the group and Elon Musk, the council includes "the leading Bitcoin miners in North America." But research from a group of universities suggested that China accounted for more than 75% of Bitcoin mining as of April 2020. The authors estimated that 40% of China's Bitcoin mines were powered by coal.

[T]he group needs to do more than "disclosing and promoting the use of renewables," Alex de Vries of the website Digiconomist told the BBC. "Even if we had disclosure, that doesn't change the natural incentive of these miners to search out the cheapest and most constant sources of power - which typically comes down to (obsolete) fossil fuels," he said. "Kentucky even came up with a tax break for Bitcoin miners to come and use their obsolete coalfields. So, I'm not seeing this trend towards more renewables." However, council member Peter Wall, Chief Executive of Argo, argued that increasingly US Bitcoin miners were choosing renewable power. He felt the council could encourage change. "It's early days, it's embryonic. There will be lots of discussions moving forward about the best way to promote sustainable Bitcoin mining and to do it not just in North America," he said.

AI

Synopsys Claims Chip Design Breakthrough With AI Engineering (forbes.com) 31

MojoKid writes: Mountain View, CA silicon design tools heavyweight Synopsys is claiming a breakthrough in chip design automation that it says will usher in a new level of semiconductor innovation, taking the industry beyond the limits of Moore's Law (Gordon Moore's observation that the number of transistors in chips doubles roughly every two years), which is now considered by many to be plateauing. Synopsys' tool, called DSO.ai, is the world's first autonomous AI tool set for chip design. Synopsys claims DSO.ai can dramatically accelerate, enhance, and reduce the costs involved with something called place-and-route. Just as it sounds, place-and-route (sometimes called floor planning) refers to the placement of logic and IP blocks, and the routing of the traces and various interconnects that join them all together in a chip design. DSO.ai optimizes and streamlines this process using the iterative nature of artificial intelligence and machine learning, such that what used to take dozens of engineers weeks or potentially months can now take a junior engineer just days to complete. DSO.ai iterates on the floorplan and layout of a chip and learns from each iteration, fine-tuning and optimizing the design within its parameters and targets along the way. The old semiconductor paradigms are rapidly becoming a thing of the past. Today, it's about the best transistors, architectures, and accelerators for the job, and the human-constrained physical design effort no longer has to be a gating factor.
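DSO.ai's internals are proprietary, so purely as a hypothetical illustration of the kind of iterative search that place-and-route automates, here is a toy placer that positions four logic blocks on a grid and minimizes estimated wirelength via simulated annealing (a real tool also enforces overlap legality, timing, and congestion constraints):

```python
import math
import random

# Toy sketch only -- Synopsys does not publish DSO.ai's algorithms. It merely
# illustrates the search problem: position blocks so that total estimated
# wirelength between connected blocks is minimized. Overlaps are allowed here.
random.seed(0)

GRID = 8                                         # hypothetical 8x8 floorplan grid
nets = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]  # hypothetical two-pin nets
pos = {b: (random.randrange(GRID), random.randrange(GRID)) for b in range(4)}

def wirelength(p):
    # Manhattan-distance estimate per net, a common placement cost proxy.
    return sum(abs(p[a][0] - p[b][0]) + abs(p[a][1] - p[b][1]) for a, b in nets)

cost = best = wirelength(pos)
temp = 5.0
for _ in range(3000):
    blk = random.choice(list(pos))
    saved, pos[blk] = pos[blk], (random.randrange(GRID), random.randrange(GRID))
    new = wirelength(pos)
    # Accept improvements always; accept worse moves with a temperature-scaled chance.
    if new <= cost or random.random() < math.exp((cost - new) / temp):
        cost = new
        best = min(best, cost)
    else:
        pos[blk] = saved  # revert the move
    temp *= 0.998         # cool down: the search becomes greedier over time

print("best wirelength found:", best)
```

The "learn from each iteration" aspect of an AI-driven flow would replace the blind random moves above with a model that proposes promising moves, but the objective-driven iteration loop is the same shape.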
AMD

AMD Eyes Major Socket Change (pcgamer.com) 92

An anonymous reader quotes a report from PC Gamer: According to a tweet from Executable Fix, a well-known leaker, AMD will finally move away from PGA to LGA with the shift to AM5, the new socket set to replace AM4. They say the new socket design will be LGA-1718 -- the number representing the number of pins required for the package. They also note that a coming generation of AMD chip will support DDR5 and PCIe 4.0 with a 600-series chipset.

When we talk about PGA, we're most often discussing processors with pins sticking out the underside of a chip that slot into a motherboard with a compatible socket. An LGA design will instead see a flat array of connection points on the processor, which will align with pins within the motherboard's socket. Either way you look at it, you're getting some very bendable, if not breakable, pins. But in my opinion it's much easier to bend those pins on the CPU. While a shift to LGA may seem somewhat trivial, the change will mark a major shakeup in AMD's desktop lineup.

Data Storage

Apple's Moves Point To a Future With No Bootable Backups, Says Developer (appleinsider.com) 105

The ability to boot from an external drive on an Apple Silicon Mac may not be an option for much longer, with the creation and use of the drives apparently being phased out by Apple, according to developers of backup tools. Apple Insider reports: Mike Bombich, the founder of Bombich Software behind Carbon Copy Cloner, wrote in a May 19 blog post that the company will continue to make bootable backups for both Intel and Apple Silicon Macs, and will "continue to support that functionality as long as macOS supports it." However, with changes in the way a Mac functions with the introduction of Apple Silicon, the ability to use external booting could be limited, in part due to Apple's design decisions.

The first problem is with macOS Big Sur, as Apple made it so macOS resides on a "cryptographically sealed Signed System Volume," which can only be copied by Apple Software Restore (ASR). While CCC has experience with ASR, the tool was deemed imperfect, failing "with no explanation" and operating in a "very one-dimensional" way. The second snag is Apple Fabric, a storage system that uses per-file encryption keys. ASR didn't work with it for months until the release of macOS 11.3 restored it, and even then kernel panics ensued when cloning back to the original internal storage.

In December, Bombich spoke to Apple about ASR's reliability and was informed that Apple was working to resolve the problem. During the call, Apple's engineers also said that copying macOS system files was "not something that would be supportable in the future." "Many of us in the Mac community could see that this was the direction Apple was moving, and now we finally have confirmation," writes Bombich. "Especially since the introduction of APFS, Apple has been moving towards a lockdown of macOS system files, sacrificing some convenience for increased security." [...] While CCC won't drop the ability to copy the System folder, the tool is "going to continue to offer it with a 'best effort' approach." Meanwhile, for non-bootable data restoration, CCC's backups do still work with the macOS Migration Assistant, available when booting up a new Mac for the first time.

Hardware

Qualcomm Refreshes Snapdragon 7c Chip for PCs and Chromebooks (engadget.com) 17

In late 2019, Qualcomm announced the Snapdragon 8c and 7c, a pair of affordable chips for always-on Windows 10 PCs and Chromebooks. Today, the company is updating the latter of those two SoCs to improve performance. Engadget: The Snapdragon 7c Gen 2 features a Kryo CPU that can achieve clock speeds of up to 2.55GHz. The company claims it delivers 10 percent faster performance than "most competing platforms." Qualcomm likely has processors from Intel's Gemini Lake family in mind here. The company also claims the 7c Gen 2 can deliver up to two times the battery life of its competitors. Outside of the faster CPU, the 7c Gen 2 is more or less the same chip Qualcomm announced in 2019. It features an Adreno 618 GPU and Snapdragon X15 LTE modem. The latter allows the 7c Gen 2 to hit theoretical download speeds of 800 Mbps. As with its predecessor, the chip is designed for education and price-conscious customers. According to Qualcomm, we can expect the first Snapdragon 7c Gen 2 laptops to arrive this summer, with the first models coming from Lenovo.
