Apple Defends Mac Mini Power Button Relocation

Apple executives have defended relocating the power button to the bottom of the new M4 Mac mini, citing the computer's significantly reduced size as the driving factor behind the design change. In a Bilibili video interview, Apple's Greg Joswiak and John Ternus explained that the Mac mini's form factor, now half the size of its predecessor's, necessitated finding a new position for the power button. Despite initial criticism from users, the executives said the bottom placement still allows for convenient access. Read more of this story at Slashdot.

AI Companies Hit Development Hurdles in Race for Advanced Models

OpenAI's latest large language model, known internally as Orion, has fallen short of performance targets, reflecting a broader slowdown in AI advancement across the industry's leading companies, according to Bloomberg, which corroborates similar media reports in recent days. The model, which completed initial training in September, showed particular weakness in novel coding tasks and failed to deliver the same magnitude of improvement over its predecessor as GPT-4 achieved over GPT-3.5, the publication reported Wednesday. Google's upcoming Gemini software and Anthropic's Claude 3.5 Opus face similar challenges: Google's project is not meeting internal benchmarks, while Anthropic has delayed its model's release, Bloomberg said. Industry insiders cited by the publication pointed to a growing scarcity of high-quality training data and mounting operational costs as key obstacles. OpenAI's Orion specifically struggled because of insufficient coding data for training, the report said. OpenAI has moved Orion into post-training refinement but is unlikely to release the system before early 2025.

The report adds: [...] AI companies continue to pursue a more-is-better playbook. In their quest to build products that approach the level of human intelligence, tech firms are increasing the amount of computing power, data and time they use to train new models -- and driving up costs in the process. Anthropic CEO Dario Amodei has said companies will spend $100 million to train a bleeding-edge model this year and that the amount will hit $100 billion in the coming years. As costs rise, so do the stakes and expectations for each new model under development. Noah Giansiracusa, an associate professor of mathematics at Bentley University in Waltham, Massachusetts, said AI models will keep improving, but the rate at which that will happen is questionable. "We got very excited for a brief period of very fast progress," he said. "That just wasn't sustainable."

Further reading: OpenAI and Others Seek New Path To Smarter AI as Current Methods Hit Limitations. Read more of this story at Slashdot.