CES 2026: PlayStation on Honda's self-driving EVs coming to U.S. roads by late 2026

LAS VEGAS, January 05 (AJP) - Imagine cruising down the highway while battling mythical beasts in God of War Ragnarök or listening to bespoke soundscapes by a Japanese music producer — all from inside a self-driving electric vehicle. That future rolled into CES 2026 on Monday as Sony Honda Mobility unveiled its first production-ready model, the Afeela 1, blurring the line between automobile and entertainment hub.

The four-year-old joint venture between Sony Group and Honda Motor said customer deliveries in California will begin later this year, with prices starting at $89,000.

"Mobility will evolve into an experience that understands every user," said Izumi Kawanishi, representative director and president of Sony Honda Mobility. "A car will no longer be just about driving it. It will be about making the most of your time and space."

The Afeela 1 packs 40 sensors and computing power capable of up to 800 trillion operations per second. It launches with Level 2+ advanced driver assistance under "Afeela Intelligent Drive," with the company aiming for Level 4 autonomous capability over time.

One signature feature is the integration of PlayStation Remote Play, enabling occupants to stream games from a PS5 directly to the vehicle's panoramic display. Eric Lempel, senior vice president at Sony Interactive Entertainment, announced exclusive Afeela themes from Astro Bot, including custom wallpapers and sounds.

"We want PlayStation to be the best place to play and to give players more ways than ever to access the games they love," Lempel said. "We're excited to bring Remote Play to Afeela."

The vehicle's electrical architecture is powered by Qualcomm Technologies' Snapdragon Digital Chassis. Nakul Duggal, executive vice president at Qualcomm, called the collaboration a shared bet on the future of the intelligent car.

Afeela's conversational AI — the Afeela Personal Agent — runs on Microsoft Azure OpenAI Service, enabling personalized, natural dialogue tuned to driver preferences and context.

Adding an artistic layer, Grammy-nominated Japanese producer Tomoko Ida crafted a custom audio experience that turns electric acceleration into a musical instrument, blending Japanese and Western styles rather than mimicking traditional engine sounds.

Yasuhide Mizuno, representative director, chairperson and CEO, confirmed that trial production began last fall at Honda's East Liberty Auto Plant in Ohio, with pre-production vehicles on display at CES. Sales are initially limited to California, with delivery hubs opening this spring in Torrance and Fremont. Demo drives for early reservation holders are planned for later this year. Expansion to Arizona follows in 2027, while Japan deliveries are slated for the first half of the same year.

Sony Honda Mobility also debuted the Afeela Prototype 2026, a compact SUV-style concept expected to reach U.S. roads as early as 2028.

2026-01-06 13:30:08
Nvidia's self-driving AI to be tested on Mercedes-Benz vehicles on U.S. roads in Q1

LAS VEGAS, January 06 (AJP) - Nvidia unveiled a next-generation "reasoning AI" for autonomous driving at CES 2026 on Monday, signaling a deeper push into the self-driving vehicle market, with real-world deployments planned as early as the first quarter of this year.

At a press conference at the BlueLive Theater inside the Fontainebleau Hotel in Las Vegas, Nvidia CEO Jensen Huang introduced the technology, dubbed Alpamayo, describing it as "the world's first self-driving AI that thinks and reasons."

"The ChatGPT era of physical AI has arrived," Huang said onstage. "Machines have begun to understand the real world directly and to reason and act on their own."

Huang said Alpamayo is designed to address autonomous driving's so-called "long-tail" problem — rare, complex scenarios that are difficult to anticipate through conventional data collection alone. Unlike traditional systems that react primarily to sensor inputs, Alpamayo reasons step by step, Huang said, and can explain why it chose a particular action.

"Alpamayo not only takes sensor input to control steering, braking and acceleration, it also reasons about what action to take and tells you why," he said. "It can respond safely even in complex long-tail scenarios."

The model is a vision-language-action system built on Nvidia's physical AI platform, Cosmos, which the company describes as a world foundation model. Nvidia said Alpamayo was trained on human driving demonstrations and trillions of miles of AI-generated data.

During the presentation, Huang showed a live demonstration of a vehicle equipped with Alpamayo navigating downtown San Francisco while verbally explaining its decision-making process in real time.

"The long tail of driving can't be solved by collecting every scenario in the real world," Huang said. "But it can be solved by breaking it down into smaller, normal situations and reasoning through them."

A video shown onstage featured an AI-powered Mercedes-Benz vehicle driving through San Francisco traffic while a passenger sat behind the steering wheel with their hands resting in their lap.

"It drives so naturally because it learned directly from human demonstrators," Huang said. "In every single scenario, it reasons about what it's going to do and why."

Nvidia said it plans to release Alpamayo as open source, with the underlying code made available on the machine-learning platform Hugging Face, allowing autonomous-vehicle researchers to access and retrain the model. Several companies and research groups have already lined up to use Alpamayo for Level 4 autonomous driving development, including Lucid Motors, Jaguar Land Rover, Uber and Berkeley DeepDrive.

"Robotaxis will be the first to benefit," Huang said, adding that Mercedes-Benz vehicles equipped with Alpamayo are expected to begin operating on U.S. roads in the first quarter. Nvidia also said it is planning a robotaxi pilot service in 2027 with partners including Uber. The company declined to name additional partners or specify locations for the rollout.

"Our vision is that someday, every single car, every single truck, will be autonomous," Huang said.

2026-01-06 13:29:17
HOT STOCK: Hyundai Motor jumps over 4% early Tuesday on robotics timeline

SEOUL, January 6 (AJP) - Shares of Hyundai Motor surged in early trading Tuesday as investors cheered the company's newly unveiled AI and robotics roadmap at CES 2026, fueling optimism that growth engines beyond its core automotive business are taking clearer shape.

As of 9:14 a.m., Hyundai Motor was trading at 317,500 won (US$235), up 13,000 won, or 4.27 percent, from the previous close of 304,500 won. The stock briefly climbed as high as 330,000 won before paring gains amid broad profit-taking. By 10:24 a.m., shares had eased to 307,500 won, tracking a broader market pause after recent gains. The KOSPI slipped 0.4 percent to 4,438.4 as investors locked in profits following the index's year-end rally.

The stock opened strong following Hyundai Motor Group's overnight presentation at CES in Las Vegas, where the company laid out a detailed timeline for AI-driven robotics and so-called "physical AI" technologies, underscoring its push to integrate artificial intelligence into manufacturing, mobility and industrial automation.

Chung Euisun, chairman of Hyundai Motor Group, has recently described artificial intelligence as a "winnable game" for the group, citing strengths in physical manufacturing, mobility platforms and real-world data. According to industry sources, Chung has emphasized that Hyundai's ability to integrate AI across vehicles, robots and production systems gives it a structural edge over peers.

At CES, the group showcased its humanoid robot Atlas, outlining plans to begin mass production in 2028 and deploy the robots at scale on U.S. assembly lines by 2030 — a timeline that analysts said helped crystallize Hyundai's commercialization strategy.

Momentum was further bolstered after Hyundai's autonomous mobility robot platform MobED won the Best of Innovation Award in the robotics category at CES 2026. The accolade marked Hyundai's first top-tier innovation award at the exhibition and was widely viewed as external validation of its robotics technology and commercialization potential.

Hyundai said it plans to leverage its software-defined factory (SDF) framework to test and validate robotics technologies in live manufacturing environments before scaling them into broader industrial, commercial and daily-life applications. Analysts said the approach signals a shift from concept-driven showcases toward practical deployment and ecosystem building, improving visibility on future monetization from AI and robotics initiatives.

Hyundai Motor reported on Monday that it sold 4.14 million vehicles in 2025, down 0.1 percent from 2024, with domestic sales up 1.1 percent and overseas sales down 0.3 percent.

2026-01-06 11:27:50
KOSPI takes breather while Asian markets stay broadly strong

SEOUL, January 06 (AJP) - South Korean stocks paused on Tuesday after a relentless rally since year-end, while broader Asian markets largely maintained upward momentum following overnight gains on Wall Street, despite lingering geopolitical jitters tied to Venezuela.

In Seoul, the benchmark KOSPI edged down 0.1 percent to 4,451.94 as of 11:00 a.m., as institutional investors locked in profits after the index recently scaled fresh record highs. The tech-heavy KOSDAQ also slipped 0.3 percent to 954.93. Overall market moves were measured, with investors rotating toward market leaders amid recent volatility. Gains in select industrial, defense and shipbuilding stocks helped offset profit-taking in heavyweight chipmakers.

Samsung Electronics fell 2.1 percent to 135,200 won ($93.40), while SK hynix slid 1.7 percent to 684,500 won in early trade, reflecting profit-taking after a strong rally. Despite the pullback, sentiment toward chipmakers remained constructive, with analysts pointing to steady AI-driven demand.

Battery and industrial shares traded mixed. LG Energy Solution, the country's third-largest company by market capitalization, rose 1.1 percent to 375,500 won, while HD Hyundai Heavy Industries advanced 2.1 percent to 524,000 won. Hanwha Aerospace, however, slipped 1.8 percent to 994,000 won amid profit-taking after recent gains.

By contrast, entertainment stocks underperformed the broader market, weighed down by lingering uncertainty over overseas content demand and regulatory risks. HYBE fell 1.9 percent to 333,500 won, while JYP Entertainment slid 1.2 percent to 71,800 won. SM Entertainment dipped 0.4 percent to 118,500 won, and YG Entertainment eased 0.2 percent to 65,000 won.

In the currency market, the Korean won traded little changed against the dollar, hovering around 1,447.6 won per dollar, as investors balanced improved equity sentiment against external risk factors.

In Tokyo, Japanese stocks extended gains in early trade. The Nikkei 225 rose 0.7 percent to 52,196.3, supported by advances in autos, financials and technology-related shares. Among major heavyweights, Toyota Motor climbed 2.1 percent to 3,472 yen ($22.20), while Mitsubishi UFJ Financial Group gained 2.7 percent to 2,613.5 yen on strength in financial shares.

Elsewhere in Asia, markets also advanced. Mainland China's Shanghai Composite added 1.4 percent to 4,023.4, while Hong Kong's Hang Seng Index rose 0.6 percent to 26,507.3.

2026-01-06 11:26:46
CES 2026: Lego no longer just child's play as it goes high-tech with AI

LAS VEGAS, January 05 (AJP) - Lego bricks don't just stack anymore — they listen, light up and talk back. At CES 2026, the Lego Group unveiled AI-embedded bricks designed to bring sound, motion and real-time feedback into physical play, blurring the line between classic building toys and interactive games.

At a media showcase on Monday at the world's largest technology trade show, the Danish toymaker introduced what it calls "Smart Play," a tech-hybrid Lego platform centered on a standard 2×4 brick embedded with sensors, LED lights and a tiny speaker. The move marks Lego's boldest step yet into connected play as it seeks to compete on equal footing with digital-first entertainment.

The Smart Brick, identical in size to a traditional Lego piece, houses a 4.1-millimeter ASIC chip — smaller than a Lego stud — that runs what the company calls the "Play Engine." The chip allows the brick to sense motion, orientation, magnetic fields and the proximity of other Smart Bricks, enabling pieces to interact with one another in real time.

"We wanted to leverage technical innovation and bring it into physical play," said Julia Goldin, Lego Group's chief product and marketing officer. "Kids have unlimited creativity. They have unlimited imagination."

Beyond the core brick, the Smart Play system includes Smart Minifigures and Smart Tags, each equipped with digital IDs readable through near-field magnetic communication. The bricks also feature an accelerometer, integrated copper coils and a miniature speaker that produces sounds triggered by live actions rather than pre-recorded clips.

During a live demonstration, two Smart Bricks changed colors depending on how far apart they were and whether they were facing each other. A toy car revved its engine when pushed forward and screeched when sharply tilted. A Lego duck quacked while "swimming" and croaked softly when laid down to sleep. A plane roared through aerial maneuvers, responding to every twist and turn.

Minifigures further altered the play experience. A civilian figure screamed when repeatedly rammed by a toy car, while a pilot figurine let out disgruntled sounds if a plane was flipped upside down.

Lego said the system operates on a proprietary wireless layer called BrickNet, built on Bluetooth technology using what it calls "Neighbor Position Measurement." Crucially, the bricks communicate directly with one another without the need for apps, internet connections or external controllers.

"The way to think of this is like a tiny distributed console, but for physical play," said Tom Donaldson, senior vice president at the Lego Group, who led the technical demonstration. "One Smart Brick can be reused to unlock a huge range of different play experiences across potentially thousands of models."

Battery performance was another focus. Lego said the Smart Bricks are designed to function even after years of inactivity, with multiple pieces capable of being charged wirelessly on a shared charging pad.

In a surprise appearance, executives from Lucasfilm joined Lego on stage alongside R2-D2 and C-3PO to preview how the Star Wars franchise will anchor the first Smart Play lineup. Lego's partnership with Lucasfilm began in 1998, when its minifigure catalog comprised just 27 characters. Today, the lineup spans more than 1,500.

"The Lego brand is all about building creativity and inventing your own adventure, and this new Lego Smart innovation takes that to the next level," said Dave Filoni, Lucasfilm's chief creative officer. "We hope it'll inspire a whole new generation of storytellers."

Lego plans to launch the Smart Play system with three Star Wars sets in March, with starting prices around $90. The company, which celebrated its 70th anniversary last year, said Smart Play reflects its effort to bring advanced technology into physical play while preserving the open-ended creativity and simplicity that have long defined the Lego brand.

2026-01-06 11:25:21
[[CES 2026]] NVIDIA CEO unveils Vera Rubin, full-stack AI platform

LAS VEGAS, January 05 (AJP) - Vera Rubin, NVIDIA's next bar-raising AI chip platform due for release in the second half of 2026, is designed not as a single product but as a full-stack platform spanning AI training, inference and deployment across large-scale systems, the company's chief executive said Monday (local time).

Unveiling the architecture, Jensen Huang, NVIDIA's founder and chief executive, said Vera Rubin represents a shift toward platform-level computing aimed at powering end-to-end AI workloads, from model development to real-world deployment at scale.

Clad in his signature leather jacket, Huang introduced the newest product at a press conference held at the BlueLive Theater inside the Fontainebleau Hotel in Las Vegas, ahead of the opening of CES 2026. Introducing the system as "One Platform for Every AI," Huang positioned Vera Rubin as a unified foundation designed to support AI training, inference and deployment at scale, rather than as a single chip or product line.

Opening his keynote, Huang placed the launch within a broader cycle of technological change. "Every 10 to 15 years, the computer industry resets," he said, pointing to past shifts from PCs to the internet, and from cloud computing to mobile platforms. This time, he argued, the transition is more disruptive. "Two platform shifts are happening at once. Applications are being built on AI, and how we build software has fundamentally changed."

Huang stressed that AI is no longer simply another layer in computing. Instead, it is reshaping the entire stack. "You no longer program software. You train it," he said. "You do not run it on CPUs. You run it on GPUs." Unlike traditional applications that rely on precompiled logic, AI systems generate outputs dynamically, producing tokens and pixels in real time based on context and reasoning.

That shift, Huang said, is driving an unprecedented surge in demand for computing power. As models grow larger and reasoning becomes central to performance, both training and inference workloads have expanded sharply. "Models are growing by an order of magnitude every year," he said, noting that test-time scaling has turned inference into a thinking process rather than a single response. "All of this is a computing problem. The faster you compute, the faster you reach the next frontier."

Vera Rubin is NVIDIA's response to that challenge. Named after astronomer Vera Florence Cooper Rubin, the platform is built through what NVIDIA calls extreme codesign, integrating six newly developed chips — including the Vera CPU and Rubin GPU — into a single system. According to NVIDIA, this approach enables large mixture-of-experts models to be trained with roughly 4x fewer GPUs and reduces inference token costs by up to 10x compared with the Blackwell platform.

Huang highlighted the platform's rack-scale architecture as a key departure from earlier designs. The Vera Rubin NVL72 system connects 72 GPUs using sixth-generation NVLink technology, creating internal bandwidth at a scale rarely seen in data centers. "The rack provides more bandwidth than the entire internet," Huang said, underscoring how data movement has become a limiting factor for modern AI systems.

Memory was another issue Huang addressed directly. As AI models move toward multi-step reasoning and longer interactions, the amount of context they must retain has outgrown on-chip memory. Vera Rubin introduces a new inference context memory platform powered by BlueField-4 processors, allowing large volumes of key-value cache data to be stored and shared locally. Huang said the goal is to reduce network congestion while maintaining consistent inference performance at scale.

Energy efficiency and system reliability were also emphasized. Despite significantly higher compute density, Huang said Vera Rubin systems are cooled with hot water at about 45 degrees Celsius, eliminating the need for energy-intensive chillers. The platform also introduces rack-scale confidential computing, encrypting data across CPUs, GPUs and interconnects to protect proprietary models during training and inference.

Throughout the keynote, Huang repeatedly framed NVIDIA's role as extending beyond chip design. "AI is a full stack," he said. "We are reinventing everything, from chips to infrastructure to models to applications." He added that this approach is intended to support what he described as the next phase of AI deployment, built around large-scale AI factories operated by cloud providers, enterprises and AI labs.

NVIDIA said Rubin-based products will be rolled out through partners beginning in the second half of 2026, with cloud providers, AI labs and enterprise customers expected to be among the early adopters.

2026-01-06 11:23:39
[[CES 2026]] NVIDIA CEO Jensen Huang delivers keynote speech

LAS VEGAS, January 05 (AJP) - NVIDIA CEO Jensen Huang delivered the keynote address during an NVIDIA press conference at the Fontainebleau Hotel's BlueLive Theater in Las Vegas on Monday (local time).

2026-01-06 11:09:34
Excitement builds ahead of BTS' full-group comeback

SEOUL, January 6 (AJP) - Excitement is building for K-pop juggernaut BTS' full-group comeback in late March, with pre-events and other promotional activities already underway in Seoul.

Promotional displays for the septet's fifth album, set for release on March 20, covered the staircases to the main hall of the Sejong Center for the Performing Arts near Gwanghwamun in central Seoul on Monday.

In a press release, the group's management agency Big Hit Music said, "BTS began in Seoul and have since expanded their presence globally. As they reunite as a full group after a long time, we arranged some offline events in the heart of Seoul, where their cultural roots are."

The agency also plans to set up similar outdoor promotional installations and activities in major cities around the world, including London, New York and Tokyo.

Having completed their mandatory military service, the seven members of BTS will return with a new 14-track album after a nearly four-year hiatus and embark on a large-scale world tour to promote it.

2026-01-06 10:50:22
Samsung breaks ground on $475 million low-carbon ammonia plant in US

SEOUL, January 06 (AJP) - Samsung's engineering unit Samsung E&A has begun construction of a low-carbon ammonia plant in the United States under the Wabash project.

The firm said on Tuesday that it held a groundbreaking ceremony the previous day for the U.S. Wabash Low-Carbon Ammonia Project at the Hay-Adams hotel in Washington. The company signed an engineering, procurement and fabrication contract with Wabash Valley Resources in October valued at about 680 billion won ($475 million), and is targeting completion of the plant in 2029.

Around 70 people attended the event, including South Korea's Minister of Land, Infrastructure and Transport Kim Yun-deok, Samsung E&A President Namgung Hong, U.S. Deputy Secretary of Energy James P. Danly, and Simon Greenshields, chairman of Wabash Valley Resources.

The facility will be built in the West Terre Haute area of Indiana and is designed to produce 500,000 tons of ammonia annually while capturing about 1.67 million tons of carbon dioxide each year.

Samsung E&A described the project as a national-level initiative supported by a fund involving the U.S. Department of Energy and South Korea's Ministry of Land, Infrastructure and Transport, as well as the Ministry of Climate, Energy and Environment.

Samsung E&A said it plans to apply its ammonia-plant experience and advanced technologies, including digital transformation, artificial intelligence, automation and modular construction. It will also work closely with the project owner and technology licensor Honeywell UOP.

2026-01-06 10:46:41
[[CES 2026]] South Korea's Doosan Bobcat showcases voice-controlled machinery

LAS VEGAS, January 06 (AJP) - South Korean construction equipment maker Doosan Bobcat unveiled next-generation equipment technologies at CES 2026 on Tuesday, highlighting artificial intelligence-based solutions aimed at improving productivity on jobsites.

Vice Chairman Scott Park and Joel Honeyman, executive vice president for global innovation, presented AI-driven systems designed to simplify machine operation, reduce downtime and help equipment adapt to complex work environments.

Among the new technologies is what the company described as the compact equipment industry's first AI-based voice-control system, the "Bobcat Jobsite Companion." The system allows operators to use voice commands to control more than 50 functions, including equipment settings, engine speed, lighting and radio controls. Doosan Bobcat said the system can also recommend optimal settings based on the task and attachments in use.

Built on the company's proprietary large language model, Jobsite Companion provides real-time responses and runs as an onboard AI model, allowing it to operate even when network connectivity is limited.

"Jobsite Companion lowers the barrier for new operators while helping experienced operators work faster and more accurately," Honeyman said. "It's not just smart technology — it's a smart experience that provides expert guidance from the driver's seat."

The company also introduced "Service.AI," an integrated support platform for dealers and technicians. The system combines repair manuals, warranty data, diagnostic guides and a database of previous service cases to help shorten repair times and reduce equipment downtime. It can be accessed through text or voice commands.

In addition, Doosan Bobcat showcased a modular, fast-charging standard battery pack known as "BSUP," a concept machine dubbed RogueX3, a collision warning and avoidance system, and next-generation operator displays.

Doosan Bobcat said the technologies are designed as part of a single, integrated solution ecosystem, with many developed for future commercialization. The company plans to display them at the Doosan Group booth at CES.

2026-01-06 10:15:22
