Journalist: Lee Hugh
[[CES 2026]] SK hynix to spotlight next-generation AI memory at CES 2026 SEOUL, January 06 (AJP) - SK hynix will showcase its next-generation artificial intelligence memory solutions at CES 2026 in Las Vegas, highlighting new high-bandwidth memory and low-power products as demand for AI infrastructure accelerates. The chipmaker will operate a customer-focused exhibition booth at the Venetian Expo from Jan. 6 to 9, shifting its emphasis from large-scale brand promotion to direct engagement with key clients, the company said. At the center of the display will be a 16-layer HBM4 product with 48 gigabytes of capacity, which SK hynix plans to unveil publicly for the first time. The company will also present its 12-layer HBM3E with 36GB, a product positioned to support near-term growth in AI servers. Beyond high-bandwidth memory, SK hynix will introduce SOCAMM2, a low-power memory module designed for AI servers, along with LPDDR6 for on-device AI applications. In NAND flash, the company plans to showcase a 321-layer 2-terabit QLC product aimed at high-capacity enterprise solid-state drives for data centers. The company will also operate an “AI System Demo Zone,” where it will demonstrate how future memory technologies — including customized HBM, processing-in-memory solutions and CXL-based memory modules — could integrate into next-generation AI systems. SK hynix said the exhibition reflects its strategy to deepen its role in AI infrastructure by expanding its portfolio of specialized memory products and strengthening collaboration with customers. “As innovation triggered by AI accelerates further, customers’ technical requirements are evolving rapidly,” said Justin Kim, president and head of AI Infrastructure at SK hynix. “We will meet those needs with differentiated memory solutions, and through close cooperation with customers, create new value that contributes to the advancement of the AI ecosystem.” 2026-01-06 14:05:32 -
[[CES 2026]] South Korea's Hyundai Wia unveils industrial robots, mobility parts LAS VEGAS, January 06 (AJP) - South Korea's Hyundai Wia has unveiled its manufacturing and logistics robot brand, H-Motion, at CES 2026, which opened here on Tuesday. H-Motion is Hyundai Wia’s first robotics brand and platform, covering autonomous mobile robots (AMRs), parking robots and collaborative robots designed for industrial use. The company said its AMRs and parking robots have already been supplied to Hyundai Motor manufacturing sites, including Hyundai Motor Group Metaplant America in Georgia, while the collaborative robot has completed mass-production testing at Hyundai Mobis. Hyundai Wia’s AMR can carry loads of up to 1.5 tons and supports both lidar-based autonomous navigation and guided driving using QR-code recognition. Its modular top unit, which loads cargo, can be replaced depending on the task, including lift modules that adjust height and turntable modules that change cargo direction. The parking robot operates as a paired system, with two robots moving beneath a vehicle, lifting its wheels and transporting it. The system can handle vehicles weighing up to 3.4 tons at speeds of up to 1.2 meters per second. The H-Motion lineup also includes collaborative robots designed to work alongside humans, automated guided vehicles that follow fixed routes using QR codes or magnets embedded in floors, and mobile platform robots that combine an AMR with a robotic arm. Alongside robotics, Hyundai Wia showcased future mobility components at CES, including a distributed heating, ventilation and air-conditioning (HVAC) system for next-generation vehicles. “Through CES, people will be able to see Hyundai Wia’s mobility and robotics capabilities in a comprehensive way,” CEO Kwon Oh-sung said. 2026-01-06 14:03:59 -
Ex-DP lawmaker's aide under police investigation for alleged bribery SEOUL, January 6 (AJP) - A former aide to ex-Democratic Party lawmaker Kang Sun-woo is being questioned over bribery allegations, police said on Tuesday. Questioning began around 7 a.m. at a police station in Mapo, western Seoul, with the aide suspected of receiving 100 million won on behalf of Kang in return for the DP's candidate nominations for the 2022 local elections and holding onto the cash. The investigation came after a relevant recording surfaced last week, which led Kim Byung-ki to resign from his post as the DP's floor leader after it appeared he had overlooked the alleged bribery. Kang said she had instructed her aide to return the money and confirmed that it was returned, but the aide claimed to know nothing about it. Meanwhile, Seoul city official Kim Kyung, who is believed to have handed the money to Kang, left for the U.S. shortly after the allegations were revealed. 2026-01-06 13:52:40 -
CES 2026: PlayStation aboard Sony Honda self-driving EVs headed for U.S. roads by late 2026 LAS VEGAS, January 05 (AJP) - Imagine cruising down the highway while battling mythical beasts in God of War Ragnarök or listening to bespoke soundscapes by a Japanese music producer — all from inside a self-driving electric vehicle. That future rolled into CES 2026 on Monday as Sony Honda Mobility unveiled its first production-ready model, the Afeela 1, blurring the line between automobile and entertainment hub. The four-year-old joint venture between Sony Group and Honda Motor said customer deliveries in California will begin later this year, with prices starting at $89,000. "Mobility will evolve into an experience that understands every user," said Izumi Kawanishi, representative director and president of Sony Honda Mobility. "Being a car will no longer be about driving it. It will be about making the most of your time and space." Afeela 1 packs 40 sensors and computing power capable of up to 800 trillion operations per second. It launches with Level 2+ advanced driver assistance under "Afeela Intelligent Drive," with the company aiming for Level 4 autonomous capability over time. One signature feature is the integration of PlayStation Remote Play, enabling occupants to stream games from a PS5 directly to the vehicle's panoramic display. Eric Lempel, senior vice president at Sony Interactive Entertainment, also announced exclusive Afeela themes from Astro Bot, including custom wallpapers and sounds. "We want PlayStation to be the best place to play and to give players more ways than ever to access the games they love," Lempel said. "We're excited to bring Remote Play to Afeela." The vehicle's electrical architecture is powered by Qualcomm Technologies via the Snapdragon Digital Chassis. Nakul Duggal, executive vice president at Qualcomm, called the collaboration a shared bet on the intelligent car's future. 
Afeela's conversational AI — the Afeela Personal Agent — runs on Microsoft Azure OpenAI Service, enabling personalized, natural dialogue tuned to driver preferences and context. Adding an artistic layer, Grammy-nominated Japanese producer Tomoko Ida crafted a custom audio experience that turns electric acceleration into a musical instrument, blending Japanese and Western styles rather than mimicking traditional engine sounds. Yasuhide Mizuno, representative director, chairperson and CEO, confirmed that trial production began last fall at Honda's East Liberty Auto Plant in Ohio, with pre-production vehicles on display at CES. Sales are initially limited to California, with delivery hubs opening this spring in Torrance and Fremont. Demo drives for early reservation holders are planned later this year. Expansion to Arizona follows in 2027, while Japan deliveries are slated for the first half of the same year. Sony Honda Mobility also debuted the Afeela Prototype 2026, a compact SUV-style concept expected to reach U.S. roads as early as 2028. 2026-01-06 13:30:08 -
Nvidia AI chip for self-driving to be tested in Mercedes-Benz vehicles on U.S. roads in Q1 LAS VEGAS, January 06 (AJP) - Nvidia unveiled a next-generation “reasoning AI” for autonomous driving at CES 2026 on Monday, signaling a deeper push into the self-driving vehicle market with real-world deployments planned as early as the first quarter of this year. At a press conference at the Fontainebleau Live Theater in Las Vegas, Nvidia CEO Jensen Huang introduced the technology, dubbed Alpamayo, describing it as “the world’s first self-driving AI that thinks and reasons.” “The ChatGPT era of physical AI has arrived,” Huang said onstage. “Machines have begun to understand the real world directly and to reason and act on their own.” Huang said Alpamayo is designed to address autonomous driving’s so-called “long-tail” problem — rare, complex scenarios that are difficult to anticipate through conventional data collection alone. Unlike traditional systems that react primarily to sensor inputs, Alpamayo reasons step by step, Huang said, and can explain why it chose a particular action. “Alpamayo not only takes sensor input to control steering, braking and acceleration, it also reasons about what action to take and tells you why,” he said. “It can respond safely even in complex long-tail scenarios.” The model is a vision-language-action system built on Nvidia’s physical AI platform, Cosmos, which the company describes as a world foundation model. Nvidia said Alpamayo was trained using human driving demonstrations and trillions of miles of AI-generated data. During the presentation, Huang showcased a live demonstration of a vehicle equipped with Alpamayo navigating downtown San Francisco, verbally explaining its decision-making process in real time. “The long tail of driving can’t be solved by collecting every scenario in the real world,” Huang said. 
“But it can be solved by breaking it down into smaller, normal situations and reasoning through them.” A video shown onstage featured an AI-powered Mercedes-Benz vehicle driving through San Francisco traffic while a passenger sat behind the steering wheel with their hands resting in their lap. “It drives so naturally because it learned directly from human demonstrators,” Huang said. “In every single scenario, it reasons about what it’s going to do and why.” Nvidia said it plans to release Alpamayo as open source, with the underlying code made available on the machine-learning platform Hugging Face, allowing autonomous-vehicle researchers to access and retrain the model. Several companies and research groups have already lined up to use Alpamayo for Level 4 autonomous driving development, including Lucid Motors, Jaguar Land Rover, Uber and Berkeley DeepDrive. “Robotaxis will be the first to benefit,” Huang said, adding that Mercedes-Benz vehicles equipped with Alpamayo are expected to begin operating on U.S. roads in the first quarter. Nvidia also said it is planning a robotaxi pilot service in 2027 with partners including Uber. The company declined to name additional partners or specify locations for the rollout. “Our vision is that someday, every single car, every single truck, will be autonomous,” Huang said. 2026-01-06 13:29:17 -
HOT STOCK: Hyundai Motor jumps over 4% early Tuesday on robotics timeline SEOUL, January 6 (AJP) - Shares of Hyundai Motor surged in early trading Tuesday, as investors cheered the company's newly unveiled AI and robotics roadmap at CES 2026, fueling optimism that growth engines beyond its core automotive business are taking clearer shape. As of 9:14 a.m., Hyundai Motor was trading at 317,500 won (US$235), up 13,000 won, or 4.27 percent, from the previous close of 304,500 won. The stock briefly climbed as high as 330,000 won before paring gains amid broad profit-taking. By 10:24 a.m., shares eased to 307,500 won, tracking a broader market pause after recent gains. The KOSPI slipped 0.4 percent to 4,438.4 as investors locked in profits following the index's year-end rally. The stock opened strong following Hyundai Motor Group's overnight presentation at CES in Las Vegas, where the company laid out a detailed timeline for AI-driven robotics and so-called "physical AI" technologies, underscoring its push to integrate artificial intelligence into manufacturing, mobility and industrial automation. Chung Euisun, chairman of Hyundai Motor Group, has recently described artificial intelligence as a "winnable game" for the group, citing strengths in physical manufacturing, mobility platforms and real-world data. According to industry sources, Chung has emphasized that Hyundai's ability to integrate AI across vehicles, robots and production systems gives it a structural edge over peers. At CES, the group showcased its humanoid robot Atlas, outlining plans to begin mass production in 2028 and deploy the robots at scale on U.S. assembly lines by 2030 — a timeline that analysts said helped crystallize Hyundai's commercialization strategy. Momentum was further bolstered after Hyundai's autonomous mobility robot platform MobED won the Best of Innovation Award in the robotics category at CES 2026. 
The accolade marked Hyundai’s first top-tier innovation award at the exhibition and was widely viewed as external validation of its robotics technology and commercialization potential. Hyundai said it plans to leverage its software-defined factory (SDF) framework to test and validate robotics technologies in live manufacturing environments before scaling them into broader industrial, commercial and daily-life applications. Analysts said the approach signals a shift from concept-driven showcases toward practical deployment and ecosystem building — improving visibility on future monetization from AI and robotics initiatives. Hyundai Motor reported on Monday that it sold 4.14 million vehicles in 2025, down 0.1 percent from 2024, with domestic sales rising 1.1 percent while overseas sales fell 0.3 percent. 2026-01-06 11:27:50 -
KOSPI takes breather while Asian markets stay broadly strong SEOUL, January 06 (AJP) - South Korean stocks took a pause on Tuesday after a relentless rally since year-end, while broader Asian markets largely maintained upward momentum following overnight gains on Wall Street, despite lingering geopolitical jitters tied to Venezuela. In Seoul, the benchmark KOSPI edged down 0.1 percent to 4,451.94 as of 11:00 a.m., as institutional investors locked in profits after the index recently scaled fresh record highs. The tech-heavy KOSDAQ also slipped 0.3 percent to 954.93. Overall market moves were measured, with investors rotating toward market leaders amid recent volatility. Gains in select industrial, defense and shipbuilding stocks helped offset profit-taking in heavyweight chipmakers. Samsung Electronics fell 2.1 percent to 135,200 won ($93.4), while SK hynix slid 1.7 percent to 684,500 won in early trade, reflecting profit-taking after a strong rally. Despite the pullback, sentiment toward chipmakers remained constructive, with analysts pointing to steady AI-driven demand. Battery and industrial shares traded mixed. LG Energy Solution, the country’s third-largest company by market capitalization, rose 1.1 percent to 375,500 won, while HD Hyundai Heavy Industries advanced 2.1 percent to 524,000 won. Hanwha Aerospace, however, slipped 1.78 percent to 994,000 won amid profit-taking after recent gains. By contrast, entertainment stocks underperformed the broader market, weighed down by lingering uncertainty over overseas content demand and regulatory risks. HYBE fell 1.9 percent to 333,500 won, while JYP Entertainment slid 1.2 percent to 71,800 won. SM Entertainment dipped 0.4 percent to 118,500 won, and YG Entertainment eased 0.2 percent to 65,000 won. In the currency market, the Korean won traded little changed against the dollar, hovering around 1,447.6 won per dollar, as investors balanced improved equity sentiment against external risk factors. 
In Tokyo, Japanese stocks extended gains in early trade. The Nikkei 225 rose 0.7 percent to 52,196.3, supported by advances in autos, financials and technology-related shares. Among major heavyweights, Toyota Motor climbed 2.1 percent to 3,472 yen ($22.2), while Mitsubishi UFJ Financial Group gained 2.7 percent to 2,613.5 yen on strength in financial shares. Elsewhere in Asia, market moves were more restrained. Mainland China’s Shanghai Composite added 1.4 percent to 4,023.4, while Hong Kong’s Hang Seng Index rose 0.6 percent to 26,507.3. 2026-01-06 11:26:46 -
CES 2026: Lego no longer just child's play as it goes high-tech and AI LAS VEGAS, January 05 (AJP) - Lego bricks don’t just stack anymore — they listen, light up and talk back. At CES 2026, the Lego Group unveiled AI-embedded bricks designed to bring sound, motion and real-time feedback into physical play, blurring the line between classic building toys and interactive games. At a media showcase on Monday at the world’s largest technology trade show, the Danish toymaker introduced what it calls “Smart Play,” a tech-hybrid Lego platform centered on a standard 2×4 brick embedded with sensors, LED lights and a tiny speaker. The move marks Lego’s boldest step yet into connected play as it seeks to compete on equal footing with digital-first entertainment. The Smart Brick, identical in size to a traditional Lego piece, houses a 4.1-millimeter ASIC chip — smaller than a Lego stud — that runs what the company calls the “Play Engine.” The chip allows the brick to sense motion, orientation, magnetic fields and the proximity of other Smart Bricks, enabling pieces to interact with one another in real time. “We wanted to leverage technical innovation and bring it into physical play,” said Julia Goldin, Lego Group’s chief product and marketing officer. “Kids have unlimited creativity. They have unlimited imagination.” Beyond the core brick, the Smart Play system includes Smart Minifigures and Smart Tags, each equipped with digital IDs readable through near-field magnetic communication. The bricks also feature an accelerometer, integrated copper coils and a miniature speaker that produces sounds triggered by live actions rather than pre-recorded clips. During a live demonstration, two Smart Bricks changed colors depending on how far apart they were and whether they were facing each other. A toy car revved its engine when pushed forward and screeched when sharply tilted. A Lego duck quacked while “swimming” and croaked softly when laid down to sleep. 
A plane roared through aerial maneuvers, responding to every twist and turn. Minifigures further altered the play experience. A civilian figure screamed when repeatedly rammed by a toy car, while a pilot figurine let out disgruntled sounds if a plane was flipped upside down. Lego said the system operates on a proprietary wireless layer called BrickNet, built on Bluetooth technology using what it calls “Neighbor Position Measurement.” Crucially, the bricks communicate directly with one another without the need for apps, internet connections or external controllers. “The way to think of this is like a tiny distributed console, but for physical play,” said Tom Donaldson, senior vice president at the Lego Group, who led the technical demonstration. “One Smart Brick can be reused to unlock a huge range of different play experiences across potentially thousands of models.” Battery performance was another focus. Lego said the Smart Bricks are designed to function even after years of inactivity, with multiple pieces capable of being charged wirelessly on a shared charging pad. In a surprise appearance, executives from Lucasfilm joined Lego on stage alongside R2-D2 and C-3PO to preview how the Star Wars franchise will anchor the first Smart Play lineup. Lego’s partnership with Lucasfilm began in 1998, when its minifigure catalog comprised just 27 characters. Today, the lineup spans more than 1,500. “The Lego brand is all about building creativity and inventing your own adventure, and this new Lego Smart innovation takes that to the next level,” said Dave Filoni, Lucasfilm’s chief creative officer. “We hope it’ll inspire a whole new generation of storytellers.” Lego plans to launch the Smart Play system with three Star Wars sets in March, with starting prices around $90. 
The company, which celebrated its 70th anniversary last year, said Smart Play reflects its effort to bring advanced technology into physical play while preserving the open-ended creativity and simplicity that have long defined the Lego brand. 2026-01-06 11:25:21 -
[[CES 2026]] NVIDIA CEO unveils Vera Rubin, full-stack AI platform LAS VEGAS, January 05 (AJP) - Vera Rubin, NVIDIA's next-generation AI chip platform slated for release in the second half of 2026, is designed not as a single product but as a full-stack platform spanning AI training, inference and deployment across large-scale systems, the company's chief executive said Monday (local time). Unveiling the architecture, Jensen Huang, NVIDIA’s founder and chief executive, said Vera Rubin represents a shift toward platform-level computing aimed at powering end-to-end AI workloads, from model development to real-world deployment at scale. Clad in his signature leather jacket, Huang introduced the newest product at a press conference held at the BlueLive Theater inside the Fontainebleau Hotel in Las Vegas, ahead of the opening of CES 2026. Introducing the system as "One Platform for Every AI," Huang positioned Vera Rubin as a unified foundation designed to support AI training, inference and deployment at scale, rather than as a single chip or product line. Opening his keynote, Huang placed the launch within a broader cycle of technological change. "Every 10 to 15 years, the computer industry resets," he said, pointing to past shifts from PCs to the internet, and from cloud computing to mobile platforms. This time, he argued, the transition is more disruptive. "Two platform shifts are happening at once. Applications are being built on AI, and how we build software has fundamentally changed." Huang stressed that AI is no longer simply another layer in computing. Instead, it is reshaping the entire stack. "You no longer program software. You train it," he said. "You do not run it on CPUs. You run it on GPUs." Unlike traditional applications that rely on precompiled logic, AI systems now generate outputs dynamically, producing tokens and pixels in real time based on context and reasoning. That shift, Huang said, is driving an unprecedented surge in demand for computing power. 
As models grow larger and reasoning becomes central to performance, both training and inference workloads have expanded sharply. "Models are growing by an order of magnitude every year," he said, noting that test-time scaling has turned inference into a thinking process rather than a single response. "All of this is a computing problem. The faster you compute, the faster you reach the next frontier." Vera Rubin is NVIDIA’s response to that challenge. Named after astronomer Vera Florence Cooper Rubin, the platform is built through what NVIDIA calls extreme codesign, integrating six newly developed chips — including the Vera CPU and Rubin GPU — into a single system. According to NVIDIA, this approach enables large mixture-of-experts models to be trained with roughly 4x fewer GPUs and reduces inference token costs by up to 10x compared with the Blackwell platform. Huang highlighted the platform’s rack-scale architecture as a key departure from earlier designs. The Vera Rubin NVL72 system connects 72 GPUs using sixth-generation NVLink technology, creating internal bandwidth at a scale rarely seen in data centers. "The rack provides more bandwidth than the entire internet," Huang said, underscoring how data movement has become a limiting factor for modern AI systems. Memory was another issue Huang addressed directly. As AI models move toward multi-step reasoning and longer interactions, the amount of context they must retain has outgrown on-chip memory. Vera Rubin introduces a new inference context memory platform powered by BlueField-4 processors, allowing large volumes of key-value cache data to be stored and shared locally. Huang said the goal is to reduce network congestion while maintaining consistent inference performance at scale. Energy efficiency and system reliability were also emphasized. 
Despite significantly higher compute density, Huang said Vera Rubin systems are cooled using hot water at about 45 degrees Celsius, eliminating the need for energy-intensive chillers. The platform also introduces rack-scale confidential computing, encrypting data across CPUs, GPUs and interconnects to protect proprietary models during training and inference. Throughout the keynote, Huang repeatedly framed NVIDIA’s role as extending beyond chip design. "AI is a full stack," he said. "We are reinventing everything, from chips to infrastructure to models to applications." He added that this approach is intended to support what he described as the next phase of AI deployment, built around large-scale AI factories operated by cloud providers, enterprises and AI labs. NVIDIA said Rubin-based products will be rolled out through partners beginning in the second half of 2026, with cloud providers, AI labs and enterprise customers expected to be among the early adopters. 2026-01-06 11:23:39 -
[[CES 2026]] NVIDIA CEO Jensen Huang delivers keynote speech Las Vegas, January 5 (AJP) - NVIDIA CEO Jensen Huang delivered the keynote address during an NVIDIA press conference at the Fontainebleau Hotel's BlueLive Theater in Las Vegas on Monday (local time). 2026-01-06 11:09:34
