NewzIntel.com


Category: Artificial Intelligence

  • MIL-OSI USA: DBEDT NEWS RELEASE: DIGITAL EQUITY INNOVATION AWARDS HONORS THOSE HELPING TO CLOSE THE DIGITAL DIVIDE IN HAWAI‘I

    Source: US State of Hawaii


    Posted on Oct 9, 2024 in Latest Department News, Newsroom

    DEPARTMENT OF BUSINESS, ECONOMIC DEVELOPMENT AND TOURISM

    JOSH GREEN, M.D., GOVERNOR

    SYLVIA LUKE, LIEUTENANT GOVERNOR

    JAMES KUNANE TOKIOKA, DIRECTOR

    CHUNG I. CHANG, STRATEGIC BROADBAND COORDINATOR

    FOR IMMEDIATE RELEASE

    October 9, 2024

     

    DIGITAL EQUITY INNOVATION AWARDS HONORS THOSE HELPING TO CLOSE THE DIGITAL DIVIDE IN HAWAI‘I

     

    First-ever awards held during Digital Inclusion Week

     

    In recognition of the work of individuals and organizations helping to provide internet access and close the digital divide across the state of Hawai‘i, 18 recipients of the first-ever Digital Equity Innovation Awards (DEIA) were honored today.

     

    Conducted in conjunction with National Digital Inclusion Week (October 7-11), the awards ceremony this morning recognized pioneers, future innovators, dedicated advocates, impactful organizations and data-driven leaders making significant strides in digital equity. This includes providing others with access to technology from broadband connectivity to devices, as well as teaching the necessary digital skills that are beneficial in employment, education, healthcare and other important facets of everyday life.

     

    The digital awards were organized by the state Department of Business, Economic Development and Tourism (DBEDT) Hawai‘i Broadband and Digital Equity Office (HBDEO), the Broadband Hui and Pacific International Center for High Technology Research (PICHTR), in partnership with the four county governments and the islands’ nonprofit community access television providers, ʻŌlelo Community Media, Hōʻike Kaua‘i Community Television, Akakū Maui Community Media and Nā Leo TV. The awards recognized those in each of the four counties in the following categories:

     

    • Digital Equity Pioneer Award: Those making outstanding contributions to closing the digital divide in each of Hawai‘i’s counties through innovative access and skills training.
    • Future Innovators Award: Student teams driving digital inclusion within their schools and communities with creative solutions and leadership.
    • Digital Equity Luminary Award: Individuals championing digital equity through sustained advocacy and impactful leadership.
    • Community Impact Award: Organizations with measurable success in fostering digital inclusion and reducing disparities.
    • Digital Equity Beacon Award: Those who effectively use data to tell stories, measure progress and drive decision-making.

     

    Hawai‘i Lt. Governor Sylvia Luke, who last year announced the launch of the state’s “Connect Kākou” initiative to expand broadband service statewide through anticipated federal funding, praised the accomplishments of the DEIA winners.

     

    “Achieving accessible and affordable high-quality internet for all of Hawaiʻi is the commitment of Connect Kākou. Making this a reality will require a collective effort—from government and nonprofits to businesses, students, educators, and digital equity leaders,” Lt. Gov. Luke said. “Mahalo to the dedicated community champions who are paving the way to create a future that keeps us all connected for generations to come.”

     

    The awardees are listed below and grouped by county:

     

    City and County of Honolulu

    Dotty Kelly-Paddock, Hui O Hau‘ula (Community Impact Award)

    Dan Smith, Hawai‘i Broadband Hui (Beacon Award)

    Stacey Aldrich, Hawai‘i State Librarian (Luminary Award)

    Wendy Dakroub and Sasha Kamahele, Tech Savvy Teens (Future Innovators Award)

    Jill Takasaki Canfield, Hawai‘i Literacy (Pioneer Award)

     

    County of Hawai‘i

    Ron and Doreen Kodani, Pi‘ihonua Hawaiian Homestead Community Association (Luminary Award)

    Brad Kaleo Bennett, ‘Auamo Collaborative (Beacon Award)

    Pono Kekela, Native Hawaiian Chamber of Commerce (Pioneer Award)

    Paola Vidulich, SPACE (Future Innovators Award)

     

    County of Kaua‘i

    David Braman, Amalia Abigania and Leah Aiwohi, Kaua‘i High School (Future Innovators Award)

    Pete Simon, Kuleana.work (Pioneer Award)

    James Thesken, Kaua‘i Technology Group (Beacon Award)

    Jackie Kaina, Kaua‘i Economic Development Board (Luminary Award)

    Ken Dickinson, Kūpuna Connections (Community Impact Award)

     

    County of Maui

    Bill Sides, Hāna Business Council East Maui Broadband (Luminary Award)

    Marc Sanders, Hāna Business Council Broadband Committee (Pioneer Award)

    Ka‘ala Souza, Māpunawai Inc. (Luminary Award)

    Michael Shiffler, Red Lightning (Community Impact Award) 

     

    A video of the DEIA awards program can be viewed at this link: https://youtu.be/h9adTnDXZcc

     

    The DEIA awards program will also be broadcast at 10 a.m. today on the Hōʻike Kaua‘i Community Television, Akakū Maui Community Media and Nā Leo TV public access channels on the neighbor islands, and tonight at 7 p.m. on O‘ahu on ʻŌlelo Community Media.

     

     

    About Hawai‘i Broadband and Digital Equity Office (HBDEO):

    HBDEO was established within the state of Hawai‘i Department of Business, Economic Development and Tourism with a mission to support and coordinate statewide deployment of high-speed internet access (broadband) and to achieve the goals of digital equity and adoption for all residents of Hawai‘i. HBDEO’s functions include the coordination, implementation, promotion, funding and managing of programs that ensure the equitable distribution of digital technologies and provide pathways to maximize Hawai‘i’s competitiveness in the digital economy.

     

    About Department of Business, Economic Development and Tourism (DBEDT):

    DBEDT is Hawai‘i’s resource center for economic and statistical data, business development opportunities, energy and conservation information, as well as foreign trade advantages. DBEDT’s mission is to achieve a Hawai‘i economy that embraces innovation and is globally competitive, dynamic and productive, providing opportunities for all Hawai‘i’s citizens. Through its attached agencies, the department fosters planned community development, creates affordable workforce housing units in high-quality living environments and promotes innovation sector job growth.

     

     

    # # #

     

     

    Media Contact:

     

    Laci Goshi

    Department of Business, Economic Development and Tourism

    808-518-5480

    [email protected]

    MIL OSI USA News –

    January 23, 2025
  • MIL-OSI Economics: Bolstering local journalism to strengthen democracy

    Source: Microsoft

    Headline: Bolstering local journalism to strengthen democracy

    A free press is essential to healthy democracy, and local journalism is a critical component of a free press. Microsoft’s Democracy Forward initiative works to preserve, protect, and advance the fundamentals of democracy by safeguarding open and secure democratic processes, promoting a healthy information ecosystem, and advocating for corporate civic responsibility.

    Four years ago, we launched a journalism initiative to explore ways in which we could help address the growing crisis facing independent local news organizations around the world. Two years ago, our Vice Chair and President Brad Smith and USAID Administrator Samantha Power announced our plan to partner with Internews to build a Media Viability Accelerator (MVA). We were thrilled to officially launch this tool during a panel event at the UN General Assembly last month.

    Bolstering Independent Journalism through the Media Viability Accelerator

    The Media Viability Accelerator is a free web analytics platform built by Internews and Microsoft Azure. Funded by USAID and Microsoft’s Democracy Forward initiative, the MVA aims to strengthen independent journalism by helping participating organizations achieve financial sustainability. Using Azure AI, the MVA harnesses the power of big data and machine learning to provide performance insights while ensuring that participants retain control over their own data. Through the MVA, media outlets can access a multilingual tool that visualizes performance data and receive actionable insights to improve performance.

    Graphic of how the Media Viability Accelerator (MVA) functions.

    More than 250 media outlets and over 500 journalists used the platform during the MVA’s initial pilot phase. Our goal is to empower over 1,000 more media outlets and thousands more journalists over the next two years, reaching audiences of hundreds of millions of people. Strengthening local journalism helps strengthen democracies around the world by ensuring that communities and voters have accurate, credible information about what’s happening around them, including and especially elections.

    Strengthening journalism globally can also help turn the tide on rising authoritarianism. One of the guests on the panel we cohosted to launch the MVA was Juan Holmann, the publisher of Nicaragua’s longest-running newspaper, La Prensa. Holmann, who spent a year and a half in one of Nicaragua’s most notorious prisons, later said of his experience:

    “I left jail with a stronger conviction that I have to continue fighting for freedom of expression. The most important right is the right to live, to be born, and to be. And the second most important is the right to free expression. The first right is useless if the second is taken away from us. Freedom of expression is the greatest because it is what makes us what we are. Freedom of expression is the right to be educated, the right to learn, to know, and to discern.”

    We’re grateful to have La Prensa as a participant in the MVA, and we’re grateful for the tremendous work Internews has put into building and running this platform. We look forward to supporting its continued growth in the years to come.

    Strengthening Democracy through Partnerships with News Organizations

    As part of our efforts to strengthen democracy around the world, we have announced projects with a number of organizations designed to help journalists and newsrooms deploy AI responsibly in newsgathering, as well as bolster business practices to help build sustainable newsrooms. These ongoing partnerships include:

    • The Institute for Nonprofit News is leveraging AI to curate stories from the Rural News Network and connect rural residents with the stories most relevant to them via SMS messaging. Up to 30 newsrooms are also receiving stipends to produce and distribute voter information guides.
    • The Craig Newmark Graduate School of Journalism at CUNY brought 25 experienced journalists to a tuition-free program to explore ways to incorporate generative AI into their work and newsrooms in a three-month hybrid and highly interactive program. The AI Journalism Lab has added two new upcoming cohorts, one focused on adoption and another focused on leadership.
    • The Online News Association launched programming to support journalists and newsroom leaders as they navigated the evolving AI ecosystem. ONA’s AI in Journalism Initiative offered a menu of opportunities addressing what is possible across the newsroom through AI and offered workshops to experiment with tools and learn about best practices. More than 2,000 journalists have been reached through in-person and virtual programming this year.
    • The GroundTruth Project, which sends local journalists into newsrooms around the world through its Report for America and Report for the World programs, added an AI track of work for its corps members through the AI in Local News initiative to explore tool adoption. The project helped local newsrooms work together to explore use cases for AI in newsgathering.
    • Semafor harnessed AI tools to assist journalists in their research and source discovery with Semafor Signals, which helped journalists provide a diverse array of credible local, national, and global sources to their audience. They also created an elections display to show connections between different countries in a massive global election year.

    As the media landscape continues to evolve in response to new technology, we are doubling down on our efforts to provide journalists with the tools they need to deliver timely, accurate information to their communities. In doing so, we can help ensure that the “fourth pillar” of democracy remains robust and resilient.

    We expect to have updated impact data on the above partnerships soon and will update this post once this information is available. News outlets or other organizations interested in joining the Media Viability Accelerator can visit http://www.mva.net to learn more.

    MIL OSI Economics –

    January 23, 2025
  • MIL-OSI United Kingdom: Energy experts appointed to deliver clean power 2030 mission

    Source: United Kingdom – Executive Government & Departments

    Government appoints leading industry and academic experts to the Clean Power 2030 Advisory Commission to help accelerate UK’s mission to decarbonise the electricity grid.

    • Eight energy and nature experts have been appointed to accelerate UK’s mission for clean power by 2030
    • the 8 commissioners have almost 200 years’ worth of experience across energy policy, environment, industry and academia
    • experts will form new Advisory Commission for Mission Control, with Energy Secretary Ed Miliband attending their first meeting today

    Eight leading figures from across industry and academia have been appointed to help accelerate the government’s world leading target to deliver clean power by 2030.

    The Clean Power 2030 Advisory Commission will support Chris Stark, Head of Mission Control, in developing a Clean Power 2030 system – providing expertise to deliver the Clean Power 2030 Action Plan, expected later this year.

    The Action Plan will set out the path to decarbonise the electricity grid, helping protect billpayers from volatile gas prices and strengthening Britain’s energy security.

    The 8 commissioners come from across industry and academia with a wealth of expertise and experience to advise on specific aspects of developing a clean power system, including planning, infrastructure, nature, and supply chains.

    The full list of commissioners includes:

    • Nick Winser: Over 30 years’ experience in the energy sector, including having been CEO of National Grid across UK and Europe, and President of the European Network of Transmission System Operators for Electricity.

    • Tim Pick: Over 25 years in the energy sector and is a passionate advocate for offshore wind having been the UK’s first Offshore Wind Champion.

    • Juliet Davenport: Founder of the Good Energy company and President of the Energy Institute. Juliet has been an innovator for over 20 years, working on ideas to fight climate change and transform the energy sector for the better.

    • Robert Gross: As well as being Director of the UK Energy Research Centre since 2020, Rob is Professor of Energy Policy and Technology at Imperial College.

    • Craig Bennett: Chief Executive of The Wildlife Trust and former CEO of Friends of the Earth, Craig has 20 years’ experience of designing and contributing to executive education and leadership programmes at numerous universities and business schools.

    • Jo Coleman: 35 years’ experience in the energy industry. Board member of several energy organisations, with a background in engineering and major project delivery in the oil and gas sector.

    • Lucy Yu: CEO and founder at Centre for Net Zero, Octopus Energy Group’s not-for-profit AI and data-driven research institute, which was set up to advance tech-driven energy systems that benefit humanity.

    • Dr Simon Harrison: A leading voice in public policy around the ways engineering can help with the energy transition and decarbonisation. Was elected a Fellow of the Royal Academy of Engineering in 2023 – the highest accolade in the profession.

    The Energy Secretary chaired the first Advisory Commission meeting this afternoon, emphasising the importance of the new group for removing barriers and accelerating the energy system towards clean power by 2030.

    Energy Secretary Ed Miliband said:

    The best way to take back control of our energy security and create highly skilled jobs is to speed up the rollout of renewables and transition towards clean homegrown power.

    The Clean Power 2030 Advisory Commission, benefiting from decades of experience across industry and academia, under Chris Stark’s leadership, will have a laser-like focus on delivering our mission for clean power by 2030.

    Head of Mission Control Chris Stark said:

    The Clean Power by 2030 is a statement of our ambition. This mission will unlock good jobs and protect the consumer, and it is key to our energy security.

    We will work closely with our partners in industry to deliver this mission at pace – these are 8 leading figures in their field to drive that partnership.

    I’m looking forward to working with all 8 commissioners to unblock barriers, spot the opportunities, and deliver a clean power system by 2030.


    Published 10 October 2024

    MIL OSI United Kingdom –

    January 23, 2025
  • MIL-OSI Economics: Microsoft expands AI capabilities to shape a healthier future

    Source: Microsoft

    Headline: Microsoft expands AI capabilities to shape a healthier future

    REDMOND, Wash. — Oct. 10, 2024 — On Thursday, Microsoft Corp. is unveiling several Microsoft Cloud for Healthcare innovations that connect care experiences, enhance team collaboration, empower healthcare workers, and unlock clinical and operational insights.

    Through new healthcare AI models in Azure AI Studio, capabilities for healthcare data solutions in Microsoft Fabric, the healthcare agent service in Copilot Studio, and an AI-driven nursing workflow solution, Microsoft Cloud for Healthcare is supporting healthcare organizations on every step of their journey toward shaping a healthier future.

    “We are at an inflection point where AI breakthroughs are fundamentally changing the way we work and live,” said Joe Petro, corporate vice president, Healthcare and Life Sciences Solutions and Platforms at Microsoft. “Across the broader healthcare and life sciences industry, these advancements are dramatically enhancing patient care and also rekindling the joy of practicing medicine for clinicians. Microsoft’s AI-powered solutions are helping lead these efforts by streamlining workflows, improving data integration, and utilizing AI to deliver better outcomes for healthcare professionals, researchers and scientists, payors, providers, medtech developers, and ultimately the patients they all serve.”

    Expanding the reach of AI beyond text: healthcare AI models in Azure AI Studio

    Microsoft is announcing the launch of healthcare AI models, a collection of cutting-edge multimodal medical imaging foundation models available in the Azure AI model catalog. Developed in collaboration with partners like Providence and Paige.ai, these models enable healthcare organizations to integrate and analyze diverse data types — ranging from medical imaging to genomics and clinical records. By using these advanced models as a foundation, healthcare organizations can rapidly build, fine-tune and deploy AI solutions tailored to their specific needs, all while minimizing the extensive compute and data requirements typically associated with building multimodal models from scratch.

    “The development of foundational AI models in pathology and medical imaging is expected to drive significant advancements in cancer research and diagnostics,” said Carlo Bifulco, MD, chief medical officer of Providence Genomics and a co-author of the Prov-GigaPath study. “These models can complement human expertise by providing insights beyond traditional visual interpretation and, as we move toward a more integrated, multimodal approach, will reshape the future of medicine.”

    Harnessing the power of healthcare data with Microsoft Fabric

    Historically, healthcare data has been difficult to access due to its unstructured nature and the limitations of existing data management systems. These challenges have limited organizations’ ability to gain a comprehensive view of patient experiences and access valuable insights.

    With the general availability of healthcare data solutions in Microsoft Fabric, healthcare organizations can overcome these barriers by reshaping how users access, manage and act on data with a single, unified AI-powered platform. Additionally, healthcare security application templates for Microsoft Purview, an innovative suite of features designed to help govern healthcare data, are available in public preview. We’re also launching new capabilities in public preview within healthcare data solutions in Microsoft Fabric including:

    • Conversational data integration: Send conversational data, such as patient conversations, from DAX Copilot to the Fabric platform. By sending DAX Copilot audio files, transcripts and draft clinical notes to Fabric, customers and partners can leverage various native tools in Azure and Fabric to analyze this data and/or combine it with other data to generate comprehensive insights.
    • Social determinants of health (SDOH) public dataset transformation: Ingest, persist, harmonize and consume SDOH national and international public datasets to enable healthcare organizations to identify risks and health-related social needs to help create equitable healthcare for all patients and communities.
    • Centers for Medicare & Medicaid Services (CMS) claim and claim line feed (CCLF) data ingestion: Streamline the ingestion of claims data and harmonize with clinical, imaging and SDOH data to unlock actionable insights on patients and populations.
    • Care management analytics: Leverage unified healthcare data and care management analytical templates to enhance patient care by identifying high-risk individuals, optimizing treatment plans and improving care coordination.
    • Data discovery and cohorting: Utilize an integrated workflow that allows healthcare organizations to create, manage, analyze and share patient cohorts.

    Building a safe and responsible healthcare agent

    Healthcare organizations face numerous challenges, including workforce shortages, rising costs and increasing patient care demands. Generative AI offers a potential solution to these challenges by automating administrative tasks, analyzing vast amounts of data for actionable insights and assisting healthcare professionals in decision-making.

    To address this, Microsoft is announcing the public preview of healthcare agent service in Copilot Studio to build Copilot agents for appointment scheduling, clinical trial matching, patient triaging and more. Organizations can leverage the healthcare agent service to help create connected patient experiences, improve clinical workflows, and empower healthcare professionals while helping organizations meet industry expectations with Microsoft Copilot Studio. Early adopters, like Cleveland Clinic, which provided feedback to help optimize the solution for a healthcare setting, are already using these innovations to enhance patient experiences and improve operational efficiency.

    Enhancing nursing workflows with AI: nursing early outcomes

    With the World Health Organization (WHO)¹ predicting a shortage of 4.5 million nurses by 2030, the urgency to deliver technology to support the nursing profession is felt more than ever.

    Last month at Epic’s UGM, we announced the next focus area for our collaboration in Epic Workshop. Today, we’re sharing more about how we’re actively collaborating with several leading healthcare organizations — including Advocate Health, Baptist Health of Northeast Florida, Duke Health, Intermountain Health Saint Joseph Hospital, Mercy, Northwestern Medicine, Stanford Health Care, and Tampa General Hospital — to build an AI solution using ambient technology that addresses nursing documentation by drafting flowsheets for review, allowing nurses to focus less on paperwork and more on their patients. This innovation expands on the company’s long-standing strategic collaboration and joint development initiatives with Epic.

    “AI is transforming nursing workflows by streamlining administrative tasks, allowing nurses to focus more on patient care,” said Corey Miller, vice president of R&D at Epic. “Together with Microsoft, we’re using AI-powered ambient voice technology to populate patient assessments. Nurses using the tool are already sharing positive feedback on how it enhances personalized patient interactions.”

    “For nurses, the integration of AI-driven solutions into our workflows is a game changer,” said Terry McDonnell, DNP, ANCP-BC, senior vice president and chief nurse executive, Duke University Health System, vice dean for Clinical Affairs, Duke University School of Nursing, Duke Health. “It allows us to focus more on patient care rather than the administrative burden of documentation. By automating tedious tasks, Microsoft’s ambient AI solution helps alleviate burnout and gives us more time to connect with our patients at the bedside, where we truly make a difference.”

    Empowering responsible AI practices across healthcare

    In line with Microsoft’s dedication to responsible AI, these new solutions adhere to the company’s AI principles established in 2018 to help guide AI development and use. Microsoft remains committed to developing responsible AI by design, ensuring that these technologies positively impact both the healthcare ecosystem and broader society. In practice this means properly building, testing and monitoring systems to avoid undesirable behaviors, such as harmful content, bias, misuse and other unintended risks. Over the years, we have made significant investments in building out the necessary governance structure, policies, tools and processes to uphold these principles and build and deploy AI safely. At Microsoft, we are committed to sharing our learnings on this journey of upholding our Responsible AI principles with our customers. We use our own best practices and learnings to provide people and organizations with capabilities and tools to build AI applications that share the same high standards we strive for.

    For more information on Microsoft Cloud for Healthcare and the new data and AI solutions and their impact, visit https://news.microsoft.com/hlth-2024, or visit Microsoft at booth #4004 at HLTH 2024.

    Microsoft (Nasdaq “MSFT” @microsoft) creates platforms and tools powered by AI to deliver innovative solutions that meet the evolving needs of our customers. The technology company is committed to making AI available broadly and doing so responsibly, with a mission to empower every person and every organization on the planet to achieve more.

    ¹Nursing and midwifery, World Health Organization, 2024

    For more information, press only:

    Microsoft Media Relations, WE Communications, (425) 638-7777, [email protected]

    Note to editors: For more information, news and perspectives from Microsoft, please visit Microsoft Source at https://news.microsoft.com/source. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at https://news.microsoft.com/microsoft-public-relations-contacts.


    Epic is a registered trademark of Epic Systems Corp.

    MIL OSI Economics –

    January 23, 2025
  • MIL-OSI Economics: Education under siege: How cybercriminals target our schools​​

    Source: Microsoft

    Headline: Education under siege: How cybercriminals target our schools​​


    Education is essentially an “industry of industries,” with K-12 and higher education enterprises handling data that could include health records, financial data, and other regulated information. At the same time, their facilities can host payment processing systems, networks that are used as internet service providers (ISPs), and other diverse infrastructure. The cyberthreats that Microsoft observes across different industries tend to be compounded in education, and threat actors have realized that this sector is inherently vulnerable. With an average of 2,507 cyberattack attempts per week, universities are prime targets for malware, phishing, and IoT vulnerabilities.¹ 

    Security staffing and IT asset ownership also affect education organizations’ cyber risks. School and university systems, like many enterprises, often face a shortage of IT resources and operate a mix of both modern and legacy IT systems. Microsoft observes that in the United States, students and faculty are more likely to use personal devices in education compared to Europe, for example. Regardless of ownership however, in these and other regions, busy users do not always have a security mindset. 

    This edition of Cyber Signals delves into the cybersecurity challenges facing classrooms and campuses, highlighting the critical need for robust defenses and proactive measures. From personal devices to virtual classes and research stored in the cloud, the digital footprint of school districts, colleges, and universities has multiplied exponentially.  

    We are all defenders. 

    A uniquely valuable and vulnerable environment 

    The education sector’s user base is very different from a typical large commercial enterprise. In the K-12 environment, users include students as young as six years old. Just like any public or private sector organization, there is a wide swath of employees in school districts and at universities including administration, athletics, health services, janitorial, food service professionals, and others. Multiple activities, announcements, information resources, open email systems, and students create a highly fluid environment for cyberthreats.

    Virtual and remote learning have also extended education applications into households and offices. Personal and multiuser devices are ubiquitous and often unmanaged—and students are not always cognizant about cybersecurity or what they allow their devices to access.

    Education is also on the front lines confronting how adversaries test their tools and their techniques. According to data from Microsoft Threat Intelligence, the education sector is the third-most targeted industry, with the United States seeing the greatest cyberthreat activity.

    Cyberthreats to education are not only a concern in the United States. According to the United Kingdom’s Department of Science Innovation and Technology 2024 Cybersecurity Breaches Survey, 43% of higher education institutions in the UK reported experiencing a breach or cyberattack at least weekly.² 

    QR codes provide an easily disguised surface for phishing cyberattacks

    Today, quick response (QR) codes are quite popular—leading to increased risks of phishing cyberattacks designed to gain access to systems and data. Images in emails, flyers offering information about campus and school events, parking passes, financial aid forms, and other official communications all frequently contain QR codes. Physical and virtual education spaces might be the most “flyer friendly” and QR code-intensive environments anywhere, given how big a role handouts, physical and digital bulletin boards, and other casual materials play in helping students navigate a mix of curriculum, institutional, and social correspondence. This creates an attractive backdrop for malicious actors to target users who are trying to save time with a quick image scan. 

    Recently the United States Federal Trade Commission issued a consumer alert on the rising threat of malicious QR codes being used to steal login credentials or deliver malware.³

    Microsoft Defender for Office 365 telemetry shows that more than 15,000 messages with malicious QR codes are targeted at the education sector daily—including phishing, spam, and malware. 

    Legitimate software tools can be used to quickly generate QR codes with embedded links to be sent in email or posted physically as part of a cyberattack. And those images are hard for traditional email security solutions to scan, making it even more important for faculty and students to use devices and browsers with modern web defenses. 

    Targeted users in the education sector often use personal devices without endpoint security, and QR codes essentially enable the threat actor to pivot to these devices. Because QR code phishing is designed to reach mobile devices, it is compelling evidence that mobile devices serve as an attack vector into enterprises, including personal and bank accounts, and that mobile device protection and visibility are needed. Microsoft has significantly disrupted QR code phishing attacks. This shift in tactics is evident in the substantial decrease in daily phishing emails intercepted by our system, dropping from 3 million in December 2023 to just 179,000 by March 2024. 

    Source: Microsoft incident response engagements.

    Universities present their own unique challenges. Much of university culture is based on collaboration and sharing to drive research and innovation. Professors, researchers, and other faculty operate under the notion that technology, science—simply knowledge itself—should be shared widely. If someone appearing as a student, peer, or similar party reaches out, they’re often willing to discuss potentially sensitive topics without scrutinizing the source. 

    University operations also span multiple industries. University presidents are effectively CEOs of healthcare organizations, housing providers, and large financial organizations—the industry of industries factor, again. Therefore, top leaders can be prime targets for anyone attacking those sectors.

    The combination of value and vulnerability found in education systems has attracted the attention of a spectrum of cyberattackers—from malware criminals employing new techniques to nation-state threat actors engaging in old-school spy craft.  

    Microsoft continually monitors threat actors and threat vectors worldwide. Here are some key issues we’re seeing for education systems. 

    Email systems in schools offer wide spaces for compromise 

    The naturally open environment at most universities forces them to be more relaxed about email hygiene. A high volume of email amounts to noise in the system, yet these institutions are often operationally limited in where and how they can place controls, because of how open they must remain for alumni, donors, external user collaboration, and many other use cases.  

    Education institutions tend to share a lot of announcements in email. They share informational diagrams around local events and school resources. They commonly allow external mailers from mass mailing systems to deliver into their environments. This combination of openness and lack of controls creates fertile ground for cyberattacks.

    AI is increasing the premium on visibility and control  

    Cyberattackers recognizing higher education’s focus on building and sharing can survey all visible access points, seeking entry into AI-enabled systems or privileged information on how these systems operate. If on-premises and cloud-based foundations of AI systems and data are not secured with proper identity and access controls, AI systems become vulnerable. Just as education institutions adapted to cloud services, mobile devices and hybrid learning—which introduced new waves of identities and privileges to govern, devices to manage, and networks to segment—they must also adapt to the cyber risks of AI by scaling these timeless visibility and control imperatives.

    Nation-state actors are after valuable IP and high-level connections 

    Universities handling federally funded research, or working closely with defense, technology, and other industry partners in the private sector, have long recognized the risk of espionage. Decades ago, universities focused on telltale physical signs of spying. They knew to look for people showing up on campus taking pictures or trying to get access to laboratories. Those are still risks, but today the dynamics of digital identity and social engineering have greatly expanded the spy craft toolkit. 

    Universities are often epicenters of highly sensitive intellectual property. They may be conducting breakthrough research. They may be working on high-value projects in aerospace, engineering, nuclear science, or other sensitive topics in partnership with multiple government agencies.  

    For cyberattackers, it can be easier to first compromise somebody in the education sector who has ties to the defense sector and then use that access to more convincingly phish a higher value target.  

    Universities also have experts in foreign policy, science, technology, and other valuable disciplines, who may willingly offer intelligence, if deceived in social-engineering cyberattacks employing false or stolen identities of peers and others who appear to be in individuals’ networks or among trusted contacts. Apart from holding valuable intelligence themselves, compromised accounts of university employees can become springboards into further campaigns against wider government and industry targets.

    Nation-state actors targeting education 

    Peach Sandstorm

    Peach Sandstorm has used password spray attacks against the education sector to gain access to its infrastructure, and Microsoft has also observed the group using social engineering against targets in higher education.  
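Password spraying tries a few common passwords across many accounts, staying under per-account lockout thresholds. As a minimal sketch (not any Microsoft tooling or API), a defender might surface that pattern in failed-sign-in logs like this; the log format and thresholds are hypothetical:

```python
from collections import defaultdict

# Hypothetical failed-sign-in records as (source_ip, username) pairs.
# A spray pattern: one source touches many distinct accounts with only a
# few attempts each, unlike brute force (many attempts on one account).
def spray_suspects(failures, min_accounts=20, max_per_account=3):
    attempts = defaultdict(lambda: defaultdict(int))
    for ip, user in failures:
        attempts[ip][user] += 1
    return [
        ip for ip, users in attempts.items()
        if len(users) >= min_accounts
        and max(users.values()) <= max_per_account
    ]

# Synthetic example: one spraying source, one brute-forcing source.
spray = [("203.0.113.9", f"user{i}") for i in range(25)]
brute = [("198.51.100.7", "admin")] * 50
print(spray_suspects(spray + brute))  # only the spraying source is flagged
```

Real detections would also weigh timing, geography, and password reuse, but the core signal is the same: breadth across accounts, not depth against one.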

    Mint Sandstorm 

    Microsoft has observed a subset of this Iranian attack group targeting high-profile experts working on Middle Eastern affairs at universities and research organizations. These sophisticated phishing attacks used social engineering to compel targets to download malicious files including a new, custom backdoor called MediaPl. 

    Mabna Institute  

    In 2023, the Iranian Mabna Institute conducted intrusions into the computing systems of at least 144 United States universities and 176 universities in 21 other countries.  

    The stolen login credentials were used for the benefit of Iran’s Islamic Revolutionary Guard Corps and were also sold within Iran through the web. Stolen credentials belonging to university professors were used to directly access university library systems. 

    Emerald Sleet

    This North Korean group primarily targets experts in East Asian policy or North and South Korean relations. In some cases, the same academics have been targeted by Emerald Sleet for nearly a decade.  

    Emerald Sleet uses AI to write malicious scripts and content for social engineering, but these attacks aren’t always about delivering malware. There’s also an evolving trend where they simply ask experts for policy insight that could be used to manipulate negotiations, trade agreements, or sanctions. 

    Moonstone Sleet 

    Moonstone Sleet is another North Korean actor that has been taking novel approaches like creating fake companies to forge business relationships with educational institutions or a particular faculty member or student.  

    One of the most prominent attacks from Moonstone Sleet involved creating a fake tank-themed game used to target individuals at educational institutions, with a goal to deploy malware and exfiltrate data. 

    Storm-1877  

    This actor largely engages in cryptocurrency theft using a custom malware family that they deploy through various means. The ultimate goal of this malware is to steal crypto wallet addresses and login credentials for crypto platforms.  

    Students are often the target for these attacks, which largely start on social media. Storm-1877 targets students because they may not be as aware of digital threats as professionals in industry. 

    A new security curriculum 

    Given the sector’s budget and talent constraints and the inherent openness of its environment, solving education security is more than a technology problem. Security posture management and prioritizing security measures can be a costly and challenging endeavor for these institutions—but there is a lot that school systems can do to protect themselves.  

    Maintaining and scaling core cyberhygiene will be key to securing school systems. Building awareness of security risks and good practices at all levels—students, faculty, administrators, IT staff, campus staff, and more—can help create a safer environment.  

    For IT and security professionals in the education sector, doing the basics and hardening the overall security posture is a good first step. From there, centralizing the technology stack can help facilitate better monitoring of logging and activity to gain a clearer picture into the overall security posture and any vulnerabilities. 

    Oregon State University 

    Oregon State University (OSU), an R1 research-focused university, places a high priority on safeguarding its research to maintain its reputation. In 2021, it experienced an extensive cybersecurity incident unlike anything it had faced before. The cyberattack revealed gaps in OSU’s security operations.

    “The types of threats that we’re seeing, the types of events that are occurring in higher education, are much more aggressive by cyber adversaries.”

    —David McMorries, Chief Information Security Officer at Oregon State University

    In response to this incident, OSU created its Security Operations Center (SOC), which has become the centerpiece of the university’s security effort. AI has also helped automate capabilities and helped its analysts, who are college students, learn to quickly write code and to hunt threats with more advanced hunting queries. 

    Arizona Department of Education 

    The Arizona Department of Education (ADE) takes its focus on Zero Trust and closed systems further than state requirements demand. It blocks all traffic from outside the United States to its Microsoft 365 environment, Azure, and its local datacenter.

    “I don’t allow anything exposed to the internet on my lower dev environments, and even with the production environments, we take extra care to make sure that we use a network security group to protect the app services.”

    —Chris Henry, Infrastructure Manager at the Arizona Department of Education 
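    The geo-blocking idea ADE describes can be illustrated with a minimal allowlist check. The CIDR ranges below are reserved documentation ranges, purely illustrative; a real deployment would rely on the platform’s own geo-filtering and network security controls rather than a hand-maintained list:

```python
import ipaddress

# Illustrative allowlist; these are reserved documentation ranges,
# standing in for the ranges a real geo-IP feed would supply.
ALLOWED_NETS = [ipaddress.ip_network(c)
                for c in ("198.51.100.0/24", "203.0.113.0/24")]

def is_allowed(source_ip: str) -> bool:
    """Admit a connection only if its source falls inside the allowlist."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in ALLOWED_NETS)

print(is_allowed("198.51.100.17"))  # True: inside an allowed range
print(is_allowed("192.0.2.1"))      # False: outside, would be blocked
```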

    Follow these recommendations:  

    • The best defense against QR code attacks is to be aware and pay attention. Pause, inspect the code’s URL before opening it, and don’t open QR codes from unexpected sources, especially if the message uses urgent language or contains errors. 
    • Consider implementing “protective domain name service,” a free tool that helps prevent ransomware and other cyberattacks by blocking computer systems from connecting to harmful websites. Prevent password spray attacks with a stringent password policy and deploy multifactor authentication.  
    • Educate students and staff about their security hygiene, and encourage them to use multifactor authentication or passwordless protections. Studies have shown that an account is more than 99.9% less likely to be compromised when using multifactor authentication.   
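    The “inspect the code’s URL” advice can be made concrete. Below is a sketch of simple red-flag checks one might run on a URL decoded from a QR code; the heuristics are chosen for illustration only, and real protection belongs in email and endpoint security tooling:

```python
from urllib.parse import urlparse
import ipaddress

# Illustrative red-flag checks for a decoded QR-code URL. Decoding the
# image itself (e.g., with a phone camera) is assumed to happen first.
def suspicious_url(url: str) -> list[str]:
    """Return a list of red flags found in the URL (empty if none)."""
    flags = []
    parsed = urlparse(url)
    if parsed.scheme != "https":
        flags.append("not HTTPS")
    host = parsed.hostname or ""
    try:
        ipaddress.ip_address(host)
        flags.append("raw IP address instead of a domain name")
    except ValueError:
        pass
    if host.startswith("xn--") or ".xn--" in host:
        flags.append("punycode host (possible lookalike domain)")
    if "@" in parsed.netloc:
        flags.append("userinfo in URL (classic obfuscation trick)")
    return flags

print(suspicious_url("http://192.168.0.10/login"))   # flags HTTP and raw IP
print(suspicious_url("https://example.edu/page"))    # no flags
```

Any non-empty result is a reason to pause before opening the link, which is exactly the habit the recommendation above describes.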

    Corey Lee has always had an interest in solving puzzles and crimes. He started his college career at Penn State University in criminal justice, but soon realized his passion for digital forensics after taking a course about investigating a desktop computer break-in.  

    After completing his degree in security and risk analysis, Corey came to Microsoft focused on gaining cross-industry experience. He’s worked on securing everything from federal, state, and local agencies to commercial enterprises, but today he focuses on the education sector.  

    After spending time working across industries, Corey sees education through a different lens: a singularly complex “industry of industries.” The dynamics at play inside the education sector include academic institutions, financial services, critical infrastructure like hospitals and transportation, and partnerships with government agencies. According to Corey, working in such a broad field allows him to leverage skillsets from multiple industries to address specific problems across the landscape. 

    The fact that education could also be called underserved from a cybersecurity standpoint is another compelling challenge, and part of Corey’s personal mission. The education industry needs cybersecurity experts to elevate the priority of protecting school systems. Corey works across public and industry dialogue, skilling and readiness programs, incident response, and overall defense to protect not just the infrastructure of education, but students, parents, teachers, and staff. 

    Today, Corey is focused on reimagining student security operations centers, including how to inject AI into the equation and bring modern technology and training to the table. By growing the cybersecurity workforce in education and giving it new tools, he’s working to elevate security in the sector in a way that’s commensurate with how critical the industry is for the future. 

    Next steps with Microsoft Security

    To learn more about Microsoft Security solutions, visit our website. Bookmark the Security blog to keep up with our expert coverage on security matters. Also, follow us on LinkedIn (Microsoft Security) and X (@MSFTSecurity) for the latest news and updates on cybersecurity.


    ¹The Institutional Impacts of a Cyberattack, University of Florida Information Technology. January 18, 2024.

    ²Cyber security breaches survey 2024: education institutions annex, The United Kingdom Department for Science, Innovation & Technology. April 9, 2024

    ³Scammers hide harmful links in QR codes to steal your information, Federal Trade Commission (Alvaro Puig), December 6, 2023.

    Methodology: Snapshot and cover stat data represent telemetry from Microsoft Defender for Office 365 showing how a QR code phishing attack was disrupted by image detection technology and how Security Operations teams can respond to this threat. Platforms like Microsoft Entra provided anonymized data on threat activity, such as malicious email accounts, phishing emails, and attacker movement within networks. Additional insights come from the 78 trillion security signals Microsoft processes each day, spanning the cloud, endpoints, the intelligent edge, and telemetry from Microsoft platforms and services including Microsoft Defender. Microsoft categorizes threat actors into five key groups: influence operations; groups in development; and nation-state, financially motivated, and private sector offensive actors. The new threat actor naming taxonomy aligns with the theme of weather.  

    © 2024 Microsoft Corporation. All rights reserved. Cyber Signals is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT. This document is provided “as is.” Information and views expressed in this document, including URL and other Internet website references, may change without notice. You bear the risk of using it. This document does not provide you with any legal rights to any intellectual property in any Microsoft product. 

    MIL OSI Economics –

    January 23, 2025
  • MIL-OSI USA: SEC Charges Rimar Capital Entities and Owner Itai Liptz for Defrauding Investors by Making False and Misleading Statements About Use of Artificial Intelligence

    Source: Securities and Exchange Commission

    Rimar Capital USA, Inc. Board Member Clifford Boro also Charged

    The Securities and Exchange Commission today announced charges against Rimar Capital USA, Inc. (Rimar USA), Rimar Capital, LLC (Rimar LLC), Itai Liptz, and Clifford Boro for making false and misleading statements about Rimar LLC’s purported use of artificial intelligence, or AI, to perform automated trading for client accounts and numerous other material misrepresentations. The parties agreed to settle the SEC’s charges and pay $310,000 in total civil penalties.

    According to the SEC order, Liptz, owner and CEO of Rimar LLC and Rimar USA, with the help of Boro, a Rimar USA board member, raised nearly $4 million from 45 investors for the development of Rimar LLC, an investment adviser that was falsely described as having an AI-driven platform for trading securities. The order found that the Rimar entities, Liptz, and Boro also made misrepresentations about Rimar LLC’s assets under management and its investment returns. In addition, the order found that Rimar LLC and Liptz obtained advisory clients using the misleading statements and that Liptz misappropriated company funds for personal expenses.

    “Through entities he controlled, Liptz lured investors and clients with multiple fabrications, including with buzzwords about the latest AI technology,” said Andrew Dean, Co-Chief of the SEC’s Asset Management Unit. “As AI becomes more popular in the investing space, we will continue to be vigilant and pursue those who lie about their firms’ technological capabilities and engage in ‘AI washing’.”

    Without admitting or denying the SEC’s findings, Rimar USA, Rimar LLC, Liptz, and Boro consented to the entry of an order finding antifraud violations and to cease and desist from violating the charged provisions. Liptz consented to pay disgorgement and prejudgment interest totaling $213,611, to pay a $250,000 civil penalty, and to be subject to an investment company prohibition and associational bar with the right to reapply in five years. Boro agreed to pay a $60,000 civil penalty. Rimar LLC consented to be censured.

    The SEC’s Office of Investor Education and Advocacy has issued an Investor Alert about AI and investment fraud.

    The SEC’s investigation was conducted by Payam Danialypour under the supervision of Brent Wilner, Associate Regional Director of the Los Angeles Regional Office, and Mr. Dean. Roberto Grasso of the Division of Examinations, Office of Risk and Strategy assisted with the investigation.

  • MIL-OSI USA: U.S. hourly electricity demand peaked in July with widespread heatwaves

    Source: US Energy Information Administration

    In-brief analysis

    October 10, 2024

    Data source: U.S. Energy Information Administration, Hourly Electric Grid Monitor
    Note: Chart shows maximum electricity demand each day based on hourly data converted to Eastern Daylight Time.

    This past summer, U.S. electricity demand in the Lower 48 states was greatest at 6:00 p.m. Eastern Daylight Time on July 15, 2024, when it reached about 745 gigawatthours (GWh), based on data in our Hourly Electric Grid Monitor. In our analysis, we calculate each day’s peak as the hour with the highest electricity demand. This year’s U.S. summer hourly peak (745 GWh) was essentially the same as in 2023 (742 GWh) and in 2022 (743 GWh). On the other hand, U.S. generation from January through July was about 2,500 terawatthours (TWh), 4% more than the 2,397 TWh generated in the same period last year, according to our Electric Power Monthly.
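    The peak calculation described here is straightforward to sketch: for each day, keep the hour with the highest demand. The figures below are invented placeholders, not EIA data:

```python
# Hourly demand records as (date, hour, demand_gwh) tuples; each day's
# peak is simply the hour with the highest demand that day.
def daily_peaks(records):
    peaks = {}
    for date, hour, demand in records:
        if date not in peaks or demand > peaks[date][1]:
            peaks[date] = (hour, demand)
    return peaks

records = [
    ("2024-07-15", 17, 738.0),  # invented values for illustration
    ("2024-07-15", 18, 745.0),
    ("2024-07-16", 18, 701.5),
]
print(daily_peaks(records))
```

The season’s overall hourly peak is then just the maximum of these daily peaks.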

    U.S. electricity demand tends to peak in July or August as air-conditioning use ramps up. Temperatures in July were above average for much of the United States, especially in parts of the West, Northeast, and Southeast, according to the Monthly National Climate Report for July 2024 from the National Oceanic and Atmospheric Administration’s (NOAA) National Centers for Environmental Information.

    Although peak hourly electricity demand in the contiguous United States was mostly flat year over year, certain regions experienced higher year-over-year peak demand based on local weather, power grid conditions, and available electricity supply.

    The U.S. electricity system is composed of three major grids: the Eastern Interconnection, Western Interconnection, and the Electric Reliability Council of Texas (ERCOT). Within each power grid are balancing authorities, which include utilities, cooperatives, and other entities, that ensure enough electricity is available to meet customer needs. If electricity supply and demand are imbalanced, local or widespread blackouts can occur.

    East
    Across the Eastern Interconnection, hourly electricity demand peaked on July 15 at about 549 GWh, as temperatures were well above average in several East Coast states that month, according to NOAA. Daily high temperatures stayed above triple digits for several consecutive days in some metropolitan areas. For instance, both Baltimore, Maryland, and Washington, DC, experienced high temperatures of 100°F or above from July 14 to 17.

    Electricity demand in an hour on August 1 came close to July’s peak, reaching about 540 GWh, but demand was curbed by the rain and power outages due to Hurricane Debby, which moved up the East Coast from August 4 to 10.

    Data source: U.S. Energy Information Administration, Hourly Electric Grid Monitor
    Note: Chart shows maximum electricity demand each day based on hourly data converted to Eastern Daylight Time and excludes electricity demand in Canadian provinces.

    Texas
    In Texas, hourly electricity demand peaked on August 20, reaching about 86 GWh, which is virtually the same as the previous all-time daily peak of 85 GWh reached in August 2023.

    Although electricity demand reached 81 GWh in an hour on July 1, demand fell by about a third to 55 GWh by July 8, when Hurricane Beryl reached the Texas coastline.

    Data source: U.S. Energy Information Administration, Hourly Electric Grid Monitor
    Note: Chart shows maximum electricity demand each day based on hourly data converted to Central Daylight Time. ERCOT=Electric Reliability Council of Texas

    West
    In the Western Interconnection, hourly electricity demand peaked on July 10 at about 141 GWh. This amount excludes British Columbia and Alberta, which are part of the regional grid.

    Data source: U.S. Energy Information Administration, Hourly Electric Grid Monitor
    Note: Chart shows maximum electricity demand each day based on hourly data converted to Pacific Daylight Time and excludes electricity demand in Canadian provinces.

    The California power grid operator, California Independent System Operator (CAISO), reported similar results for the full Western Interconnection including British Columbia and Alberta. With the two Canadian provinces, electricity demand reached about 168 GWh on July 10, setting a new record.

    Although California saw record-breaking temperatures this past summer, CAISO said electricity demand on its system, which also covers part of Nevada, peaked on July 24 at about 45 GWh, which was less than the record of 52 GWh that occurred on September 6, 2022.

    Principal contributors: Stephanie Tsao, Mark Morey

  • MIL-Evening Report: AI affects everyone – including Indigenous people. It’s time we have a say in how it’s built

    Source: The Conversation (Au and NZ) – By Tamika Worrell, Senior Lecturer in the Department of Critical Indigenous Studies, Macquarie University

    Since artificial intelligence (AI) became mainstream over the past two years, many of the risks it poses have been widely documented. As well as fuelling deep fake porn, threatening personal privacy and accelerating the climate crisis, some people believe the emerging technology could even lead to human extinction.

    But some risks of AI are still poorly understood. These include the very particular risks to Indigenous knowledges and communities.

    There’s a simple reason for this: the AI industry and governments have largely ignored Indigenous people in the development and regulation of AI technologies. Put differently, the world of AI is too white.

    AI developers and governments need to urgently fix this if they are serious about ensuring everybody shares the benefits of AI. As Aboriginal and Torres Strait Islander people like to say, “nothing about us, without us”.

    Indigenous concerns

    Indigenous peoples around the world are not ignoring AI. They are having conversations, conducting research and sharing their concerns about the current trajectory of it and related technologies.

    A well-documented problem is the theft of cultural intellectual property. For example, users of AI image generation programs such as DeepAI can artificially generate artworks in mere seconds which mimic Indigenous styles and stories of art.

    This demonstrates how easy it is for someone using AI to misappropriate cultural knowledges. These generated images are produced from large datasets of publicly available imagery to create something new, but they miss the storying and cultural knowledge present in our art practices.

    AI technologies also fuel the spread of misinformation about Indigenous people.

    The internet is already riddled with misinformation about Indigenous people. The long-running Creative Spirits website, which is maintained by a non-Indigenous person, is a prominent example.

    Generative AI systems are likely to make this problem worse. They often conflate us with other Indigenous peoples around the world. They also draw on inappropriate sources, including Creative Spirits.

    During last year’s Voice to Parliament referendum in Australia, “no” campaigners also used AI-generated images depicting Indigenous people. This demonstrates the role of AI in political contexts and the harm it can cause to us.

    Another problem is the lack of understanding of AI among Indigenous people. Some 40% of the Aboriginal and Torres Strait Islander population in Australia don’t know what generative AI is. This reflects an urgent need to provide relevant information and training to Indigenous communities on the use of the technology.

    There is also concern about the use of AI in classroom contexts and its specific impact on Indigenous students.

    Looking to the future

    Hawaiian and Samoan Scholar Jason Lewis says:

    We must think more expansively about AI and all the other computational systems in which we find ourselves increasingly enmeshed. We need to expand the operational definition of intelligence used when building these systems to include the full spectrum of behaviour we humans use to make sense of the world.

    Key to achieving this is the idea of “Indigenous data sovereignty”. This would mean Indigenous people retain sovereignty over their own data, in the sense that they own and control access to it.

    In Australia, a collective known as Maiam nayri Wingara offers important considerations and principles for data sovereignty and governance. They affirm Indigenous rights to govern and control our data ecosystems, from creation to infrastructure.

    The National Agreement on Closing the Gap also affirms the importance of Indigenous data control and access.

    This is reaffirmed at a global level as well. In 2020, a group of Indigenous scholars from around the world published a position paper laying out how Indigenous protocols can inform ethically created AI. This kind of AI would centralise the knowledges of Indigenous peoples.

    In a positive step, the Australian government’s recently proposed set of AI guardrails highlight the importance of Indigenous data sovereignty.

    For example, the guardrails include the need to ensure additional transparency and make extra considerations when it comes to using data about or owned by Aboriginal and Torres Strait Islander people, to “mitigate the perpetuation of existing social inequalities”.

    Indigenous Futurisms

    Grace Dillon, a scholar from a group of North American Indigenous people known as the Anishinaabe, first coined the term “Indigenous Futurisms”.

    Ambelin Kwaymullina, an academic and futurist practitioner from the Palyku nation in Western Australia, defines it as:

    visions of what-could-be that are informed by ancient Aboriginal cultures and by our deep understandings of oppressive systems.

    These visions, Kwaymullina writes, are “as diverse as Indigenous peoples ourselves”. They are also unified by “an understanding of reality as a living, interconnected whole in which human beings are but one strand of life amongst many, and a non-linear view of time”.

    So how can AI technologies be informed by Indigenous ways of knowing?

    A first step is for industry to involve Indigenous people in creating, maintaining and evaluating the technologies – rather than asking them retrospectively to approve work already done.

    Governments need to also do more than highlight the importance of Indigenous data sovereignty in policy documents. They need to meaningfully consult with Indigenous peoples to regulate the use of these technologies. This consultation must aim to ensure ethical AI behaviour among organisations and everyday users that honours Indigenous worldviews and realities.

    AI developers and governments like to claim they are serious about ensuring AI technology benefits all of humanity. But unless they start involving Indigenous people more in developing and regulating the technology, their claims ring hollow.

    Tamika Worrell does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    – ref. AI affects everyone – including Indigenous people. It’s time we have a say in how it’s built – https://theconversation.com/ai-affects-everyone-including-indigenous-people-its-time-we-have-a-say-in-how-its-built-239605

  • MIL-Evening Report: Use of AI in property valuation is on the rise – but we need greater transparency and trust

    Source: The Conversation (Au and NZ) – By William Cheung, Senior Lecturer, Business School, University of Auckland, Waipapa Taumata Rau

    New Zealand’s economy has been described as a “housing market with bits tacked on”. Buying and selling property is a national sport fuelled by the rising value of homes across the country.

    But the wider public has little understanding of how those property valuations are created – despite their being a key factor in most banks’ decisions about how much they are willing to lend for a mortgage.

    Automated valuation models (AVM) – systems enabled by artificial intelligence (AI) that crunch vast datasets to produce instant property values – have done little to improve transparency in the process.

    These models started gaining traction in New Zealand in the early 2010s. The early versions used limited data sources like property sales records and council information. Today’s more advanced models include high-quality geo-spatial data from sources such as Land Information New Zealand.

    AI models have improved efficiency. But the proprietary algorithms behind those AVMs can make it difficult for homeowners and industry professionals to understand how specific values are calculated.

    In our ongoing research, we are developing a framework that evaluates these automated valuations. We have looked at how the figures should be interpreted and what factors might be missed by the AI models.

    In a property market as geographically and culturally varied as New Zealand’s, these points are not only relevant — they are critical. The rapid integration of AI into property valuation is no longer just about innovation and speed. It is about trust, transparency and a robust framework for accountability.

    AI valuations are a black box

    In New Zealand, property valuation has traditionally been a labour-intensive process. Valuers would usually inspect properties, make market comparisons and apply their expert judgement to arrive at a final value estimate.

    But this approach is slow, expensive and prone to human error. As demand for more efficient property valuations increased, the use of AI brought in much-needed change.

    But the rise of these valuation models is not without its challenges. While AI offers speed and consistency, it also comes with a critical downside: a lack of transparency.

    AVMs often operate as “black boxes”, providing little insight into the data and methodologies that drive their valuations. This raises serious concerns about the consistency, objectivity and transparency of these systems.

    What exactly the algorithm is doing when an AVM estimates a home’s value is not clear. Such opaqueness has real-world consequences, perpetuating market imbalances and inequities.

    Without a framework to monitor and correct these discrepancies, AI models risk distorting the property market further, especially in a country as diverse as New Zealand, where regional, cultural and historical factors significantly influence property values.

    Transparency and accountability

    A recent discussion forum with real estate industry insiders, law researchers and computer scientists on AI governance and property valuations highlighted the need for greater accountability when it comes to AVMs. Transparency alone is not enough. Trust must be built into the system.

    This can be achieved by requiring AI developers and users to disclose data sources, algorithms and error margins behind their valuations.

    Additionally, valuation models should incorporate a “confidence interval” – a range of prices that shows how much the estimated value might vary. This offers users a clearer understanding of the uncertainty inherent in each valuation.

    But effective AI governance in property valuation cannot be achieved in isolation. It demands collaboration between regulators, AI developers and property professionals.

    Bias correction

    New Zealand urgently needs a comprehensive evaluation framework for AVMs, one that prioritises transparency, accountability and bias correction.

    This is where our research comes in. We repeatedly resample small portions of the data to account for situations where property value data do not follow a normal distribution.

    This process generates a confidence interval showing a range of possible values around each property estimate. Users are then able to understand the variability and reliability of the AI-generated valuations, even when the data are irregular or skewed.
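
    The resampling approach described here is, in effect, a bootstrap. As a rough illustration only (a sketch with invented error figures, not the authors' actual implementation), a bootstrap confidence interval for the median valuation error can be computed without any normality assumption:

```python
import random
import statistics

def bootstrap_interval(errors, n_resamples=2000, alpha=0.05, seed=42):
    """Bootstrap a confidence interval for the median valuation error.

    Resampling with replacement avoids assuming the errors follow a
    normal distribution, which matters when the data are skewed.
    """
    rng = random.Random(seed)
    medians = []
    for _ in range(n_resamples):
        # Draw a same-sized sample with replacement and record its median
        sample = [rng.choice(errors) for _ in errors]
        medians.append(statistics.median(sample))
    medians.sort()
    lower = medians[int((alpha / 2) * n_resamples)]
    upper = medians[int((1 - alpha / 2) * n_resamples) - 1]
    return lower, upper

# Hypothetical valuation errors (AVM estimate minus sale price, $000s)
errors = [5, -3, 12, 8, -20, 40, 2, -1, 15, 7, -6, 33]
low, high = bootstrap_interval(errors)
```

    The width of the resulting range is the point: a wide interval signals that an AI-generated valuation should be treated with caution.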

    Our framework goes beyond transparency. It incorporates a bias correction mechanism that detects and adjusts for consistently overvalued or undervalued estimates within AVM outputs. Examples include regional disparities and the undervaluation of particular property types.

    By addressing these biases, we ensure valuations that are not only accountable or auditable but also fair. The goal is to avoid the long-term market distortions that unchecked AI models could create.
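
    As a toy illustration of how such a correction could operate (hypothetical regions and figures; not the researchers' published method), one can divide each estimate by its region's median estimate-to-price ratio:

```python
from collections import defaultdict
from statistics import median

def regional_bias_factors(records):
    """Per-region median ratio of AVM estimate to actual sale price.

    A ratio persistently above 1.0 suggests systematic overvaluation
    in that region; below 1.0, systematic undervaluation.
    """
    by_region = defaultdict(list)
    for r in records:
        by_region[r["region"]].append(r["estimate"] / r["sale_price"])
    return {region: median(ratios) for region, ratios in by_region.items()}

def correct_estimate(estimate, region, factors):
    # Divide out the region's median bias so corrected estimates
    # are centred on observed sale prices.
    return estimate / factors.get(region, 1.0)

# Invented sale records paired with AVM estimates
records = [
    {"region": "Auckland", "estimate": 1_100_000, "sale_price": 1_000_000},
    {"region": "Auckland", "estimate": 880_000, "sale_price": 800_000},
    {"region": "Otago", "estimate": 450_000, "sale_price": 500_000},
    {"region": "Otago", "estimate": 540_000, "sale_price": 600_000},
]
factors = regional_bias_factors(records)
corrected = correct_estimate(1_100_000, "Auckland", factors)
```

    Using the median rather than the mean keeps the factor robust to a few extreme sales, which matters in thin regional markets.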

    The rise of AI auditing

    But transparency alone is not enough. The auditing of AI-generated information is becoming increasingly important.

    New Zealand’s courts now require a qualified person to check information generated by AI and subsequently used in tribunal proceedings.

    In much the same way financial auditors ensure accuracy in accounting, AI auditors will play a pivotal role in maintaining the integrity of valuations.

    Based on earlier research, we are auditing the automated valuation model estimates by comparing them with the market transacted prices of the same houses in the same period.
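
    A back-of-the-envelope version of such an audit (with invented figures and an arbitrary 10% tolerance; not the authors' actual procedure) simply measures how far estimates sit from transacted prices:

```python
from statistics import median

def audit_avm(pairs, tolerance=0.10):
    """Audit AVM output against transacted prices for the same houses
    in the same period; fail if the median absolute percentage error
    exceeds the tolerance.

    `pairs` is a list of (estimate, transacted_price) tuples.
    """
    errors = [abs(est - price) / price for est, price in pairs]
    mape = median(errors)
    return {"median_abs_pct_error": mape, "passed": mape <= tolerance}

# Invented estimate/price pairs
pairs = [(520_000, 500_000), (610_000, 650_000), (395_000, 400_000)]
report = audit_avm(pairs)
```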

    It is not just about trusting the algorithms but trusting the people and systems behind them.

    The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

    – ref. Use of AI in property valuation is on the rise – but we need greater transparency and trust – https://theconversation.com/use-of-ai-in-property-valuation-is-on-the-rise-but-we-need-greater-transparency-and-trust-240880

    MIL OSI Analysis – EveningReport.nz –

    January 23, 2025
  • MIL-Evening Report: The Texas Chain Saw Massacre and its harrowing, visceral impact has been rarely matched, 50 years on

    Source: The Conversation (Au and NZ) – By Nicholas Godfrey, Senior Lecturer, College of Humanities, Arts and Social Sciences, Flinders University

    The Texas Chain Saw Massacre is a product of a unique time in American filmmaking, when independent exploitation films were nastier than ever, and equally capable of piercing the mainstream consciousness.

    Tobe Hooper’s 1974 film arrived in a recently transformed exhibition landscape. The 1967 outcry over onscreen violence in Bonnie and Clyde marked the end of Hollywood’s Motion Picture Production Code and the introduction of film ratings.

    Films like Easy Rider (1969) elevated the standing of formerly disreputable exploitation fare within Hollywood. By 1973, The Exorcist was packing out cinemas and producing lines around city blocks with the promise of the most unremitting horror film yet made.

    The Texas Chain Saw Massacre was shot quickly on a shoestring budget, financed in part by the newly-formed Texas Film Commission. The film assembled its cast and crew from Austin’s circles of recent college graduates and dropouts.

    Its plot is straightforward enough: a group of young people are stranded when they run out of gas in rural Texas. They are terrorised and subsequently murdered by an eccentric local family, including the chainsaw wielding Leatherface – a nonverbal, childlike giant who wears masks made from the skin of his flayed victims.

    We learn this family have lost their jobs at the local slaughterhouse with the introduction of bolt gun technologies, leaving them to sell roadside meat made from human victims.

    This detail has inspired a range of thematic interpretations for the film, encompassing commentary on class and family, gender and animal rights.

    The film lays bare the horrors of meat production, inflicted on human victims. The family home is the site where these themes come into conflict.

    Porn and violence on screen

    The Texas Chain Saw Massacre was picked up by the Bryanston Distributing Company. In 1972, Bryanston was the distributor for the theatrical release of the hardcore pornographic film Deep Throat. The film’s success shifted popular discourse around pornography, and helped Bryanston widen the theatrical release for The Texas Chain Saw Massacre.

    In subsequent years, media reported on alleged abusive on-set conditions on Deep Throat, along with claims Bryanston was connected with organised crime. Director Hooper, and many of the Chain Saw Massacre cast, alleged they never received their share of box office from the distributor.

    A 1974 poster (image: Ralf Liebhold/Shutterstock).

    The Texas Chain Saw Massacre’s proximity to Deep Throat stoked controversy, conflating concern about increasingly extreme depictions of sex and violence onscreen.

    Two years earlier, young filmmaker Wes Craven had transitioned from making pornography to horror films. His low budget rape-revenge exploitation film The Last House on the Left (1972) was originally developed as a hardcore pornographic film, an approach abandoned when it entered production.

    Unlike Craven’s notorious film, The Texas Chain Saw Massacre is not overtly sexualised. While there may be a sexual undertone to Leatherface’s pursuit of Sally and her companions, it does not escalate to onscreen acts of sexual violence.

    Regardless, the film drew condemnation, particularly in the United Kingdom, where it was banned, and later figured in public debates about the censorship of “video nasties” in the 1980s.

    For my part, I remember encountering The Texas Chain Saw Massacre at the video rental store as a child: its title, cover and R-rating promised horrors beyond comprehension, many years before I actually saw the film itself.

    Horrors implied, rather than shown

    Beyond its controversies, The Texas Chain Saw Massacre played an important role in the developing field of horror film studies. It figures prominently in Robin Wood’s taxonomy of “reactionary” horror movies (which uphold traditional values) and “progressive” horror movies, which take a more ambivalent stance on the figure of the monster, challenging conservative social values. Wood counts The Texas Chain Saw Massacre in the latter category.

    It is also central to Carol J. Clover’s influential codification of the “final girl” narrative trope, in which a sole young woman is able to withstand the monster’s onslaught.

    Alongside Halloween (1978), The Texas Chain Saw Massacre helped steer the trajectory of American horror films in the 1980s.

    Halloween is situated within the manicured surroundings of suburbia, and conveys its menace through the slick technical qualities of its gliding camera, and John Carpenter’s staccato synth score.

    By contrast, The Texas Chain Saw Massacre locates its horror in the backroads and decrepit farmhouses of central Texas. The idea of Texas looms large, connoting a place of lawlessness, violence and danger.

    Hooper punctuates his long shots with extreme close ups via rapid editing. The film’s most grotesque horrors are implied, rather than shown. Its most visceral impact comes from its extended chase sequences, and via its soundtrack: Sally’s piercing screams, and Leatherface’s ever-present chainsaw.

    While the Texas Chain Saw Massacre spawned several sequels and influenced even more imitators over the years, from the Ramones to Wolf Creek (2005) and X (2022), it has rarely been matched in its intensity, and its harrowing, visceral impact.

    Nicholas Godfrey does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    – ref. The Texas Chain Saw Massacre and its harrowing, visceral impact has been rarely matched, 50 years on – https://theconversation.com/the-texas-chain-saw-massacre-and-its-harrowing-visceral-impact-has-been-rarely-matched-50-years-on-236700

    MIL OSI Analysis – EveningReport.nz –

    January 23, 2025
  • MIL-OSI USA News: FACT SHEET: Biden-Harris Administration Celebrates International Day of the Girl and Continues Commitment to Supporting Youth in the U.S. and Abroad

    Source: The White House

    International Day of the Girl provides an opportunity to celebrate the leadership of girls around the world and recommit to addressing the barriers that continue to limit their full participation. Today, to commemorate International Day of the Girl, First Lady Jill Biden will host the second “Girls Leading Change” event at the White House to recognize outstanding young women from across the United States who are making a difference in their communities. This year’s event will honor 10 young women leaders, selected by the White House Gender Policy Council, who are leading change and shaping a brighter future for generations to come.  

    The Biden-Harris Administration is committed to ensuring that girls can pursue their dreams free from fear, discrimination, violence, or abuse; and to advancing the safety, education, health, and wellbeing of girls everywhere. Investing in young people means investing in our future; and they should have the opportunity and resources they need to succeed.

    That’s why, since day one in office, this Administration has taken action to advance the safety, education, health, and well-being of girls, including:

    • Accelerating Learning and Improving Student Achievement. The American Rescue Plan, the largest one-time education investment in our history, included $130 billion to help schools address the impact of the pandemic on student well-being and academic achievement. To sustain these efforts, the Biden-Harris Administration increased funding and targeting of federal grants to better support academic recovery—from the Education Innovation and Research program to extended-day and afterschool programming through 21st Century Community Learning Centers. And the Administration’s Improving Student Achievement Agenda for 2024 is helping accelerate academic performance for every child in school.
    • Canceling Student Debt. President Biden and Vice President Harris vowed to fix the federal student loan program and make sure higher education is a ticket to the middle class—not a barrier to opportunity. The Biden-Harris Administration has approved nearly $170 billion in loan forgiveness for almost 5 million borrowers through more than two dozen executive actions with the goal of helping these borrowers get more breathing room in their daily lives, access economic mobility, buy homes, start businesses, and pursue their dreams.
    • Cutting Child Poverty Nearly in Half in 2021. President Biden and Vice President Harris believe that no child should grow up in poverty. Their expansion of the Child Tax Credit helped cut child poverty nearly in half in 2021 to a record low of 5.2%. President Biden and Vice President Harris are fighting to restore this expansion, which would lift over a million girls out of poverty and narrow racial disparities. The Biden-Harris Administration has also lifted hundreds of thousands of girls out of poverty by updating the Thrifty Food Plan and creating SunBucks, a new program that helps low-income families afford groceries over the summer when they don’t have access to school meals.
    • Supporting Youth Mental Health. President Biden and Vice President Harris believe that health care is a right, not a privilege, and that mental health care is health care—period. That’s why they invested almost $1.5 billion to strengthen the 988 Suicide & Crisis Lifeline and launched the National Mental Health Strategy, with ongoing investments to strengthen the mental health workforce, ensure parity for mental health and substance use care, connect Americans to care, and better protect youth from the harms of social media. The Biden-Harris Administration is also delivering the largest investments in school-based mental health services ever, bringing 14,000 new mental health professionals into schools across the country and making it easier for schools to leverage Medicaid to deliver care.
       
    • Preventing Gun Violence, Including Domestic Violence with Firearms. Gun violence is the leading killer of children and teenagers in the United States. President Biden and Vice President Harris have taken historic executive action to reduce gun violence and violent crime. In 2022, President Biden signed into law the Bipartisan Safer Communities Act (BSCA), the most significant new gun safety legislation in nearly 30 years. The intersection between guns and domestic violence can be especially deadly, and BSCA expanded background checks to keep guns out of the hands of more domestic abusers, narrowed the “boyfriend loophole” so an individual convicted of a misdemeanor crime of domestic violence against a dating partner is prohibited from purchasing a firearm, and expanded funding for red flag laws that allow for temporary removal of firearms from an individual who is a danger to themselves or others. President Biden established the first-ever Office of Gun Violence Prevention, overseen by Vice President Harris. The Biden-Harris Administration has made historic investments in law enforcement and community-led crime prevention and intervention strategies and has announced more executive actions to reduce gun violence than any other administration. Most recently, building on life-saving actions that the Administration has already taken, President Biden signed a new Executive Order in September 2024 to improve school-based active shooter drills and combat emerging firearms threats. The President and Vice President also announced new actions to support survivors of gun violence, promote safe gun storage, fund community violence intervention, and improve the gun background check system, among other actions.
       
    • Launching the American Climate Corps. President Biden launched the American Climate Corps to give a diverse new generation of young people the tools to fight the impacts of climate change today and the skills to join the clean energy and climate-resilience workforce of tomorrow. The American Climate Corps is tackling the climate crisis, including by restoring coastal ecosystems, strengthening urban and rural agriculture, investing in clean energy and energy efficiency, improving disaster and wildfire preparedness, and more. More than 15,000 young Americans have already been put to work in high-quality, good-paying clean energy and climate resilience workforce training and service opportunities through the American Climate Corps—putting the program on track to reach President Biden’s goal of 20,000 members in the program’s first year ahead of schedule.
       
    • Providing Children with Healthier, More Sustainable Environments. The Environmental Protection Agency’s Clean School Bus Program has awarded nearly $3 billion and funded approximately 8,700 electric and low-emission school buses nationwide, protecting children from air pollution by transforming school bus fleets across America. The Biden-Harris Administration also invested $15 billion toward replacing every toxic lead pipe in the country within a decade, protecting children and schools from lead exposure that can cause irreversible harm to cognitive development and hamper children’s learning. And earlier this year, the Environmental Protection Agency provided $58 million to protect children from lead in drinking water at schools and child care facilities.
    • Fighting Online Harassment and Abuse. Online harassment and abuse is increasingly widespread in today’s digitally connected world and disproportionately affects women, girls, and LGBTQI+ individuals. President Biden established the White House Task Force to Address Online Harassment and Abuse to coordinate comprehensive actions from more than a dozen federal agencies, and his Executive Order on artificial intelligence directs federal agencies to address deepfake image-based abuse. The Department of Justice also funded the first-ever national helpline to provide 24/7 support and specialized services for victims of online harassment and abuse, including the non-consensual distribution of intimate images; raised awareness of new legal protections against the non-consensual distribution of intimate images that were included in the Violence Against Women Act Reauthorization Act of 2022; and funded a new National Resource Center on Cybercrimes Against Individuals.
    • Keeping Students Safe and Addressing Campus Sexual Assault. The Department of Education restored and strengthened vital Title IX protections against discrimination on the basis of sex for students and employees. The Department of Justice awarded more than $20 million in FY 2024 to support colleges and universities in preventing and responding to sexual assault, domestic violence, dating violence, and stalking. And the Department of Education—in collaboration with the Departments of Justice and Health and Human Services—launched a Task Force on Sexual Violence in Education that has released data on sexual violence at educational institutions and is working to improve sexual violence prevention and response on campus.
    • Supporting Vulnerable Youth. The Biden-Harris Administration has taken action to support the needs of vulnerable and underserved youth—from helping prevent youth homelessness and human trafficking to supporting employment initiatives for youth with disabilities. This includes $800 million in dedicated funding to support students experiencing homelessness through the President’s American Rescue Plan. The Department of Health and Human Services also issued landmark rules to improve the child welfare system, particularly for the most vulnerable children, and to advance the safety and wellbeing of families across the country, including for LGBTQI+ children in foster care. And the Department of Justice has funded programs to help communities develop, enhance, or expand early intervention programs and treatment services for girls who are involved in the juvenile justice system.

    The Biden-Harris Administration has also taken action to support girls around the globe by fighting to advance the human rights of women and girls and promote access to education, health, and safety, including:

    • Promoting Girls’ Education Globally. The United States is investing in girls’ education around the world, which in turn advances health and economic development. The U.S. Agency for International Development (USAID) invested more than $2.5 billion from FY 2021-2023 to increase access to quality basic and higher education, and reached 18.7 million girls and women in 69 countries in FY23 alone to advance gender equality in and through education. The Departments of State and Labor have also supported efforts to promote girls’ education through science, technology, engineering, and mathematics (STEM) education programs in Kenya and Namibia, as well as technical and vocational education training centers for adolescent girls in Ethiopia. The United States has strongly condemned the restriction of girls’ education in Afghanistan, including by restricting visas for individuals believed to be responsible for, or complicit in, repressing women and girls by limiting or prohibiting access to education.
    • Closing the Gender Digital Divide. Last year, Vice President Harris launched the Women in the Digital Economy Fund (Wi-DEF) to accelerate progress towards closing the gender digital divide. To date, Wi-DEF has raised over $80 million, including an initial $50 million commitment from USAID. Building on the success of the Fund, the Women in the Digital Economy Initiative includes commitments from governments, private sector companies, foundations, civil society, and multilateral organizations that have pledged more than $1 billion to accelerate gender digital equality. This Initiative supports girls’ access to digital learning opportunities, provides employment and educational skills, and helps fulfill the historic commitment of G20 Leaders to halve the digital gender gap by 2030. Since the launch of Wi-DEF, the United States has invested $102 million in direct and aligned commitments to closing the gender digital divide and accelerating gender digital equality.
    • Preventing and Responding to Online Harassment and Abuse Globally. To address the scourge of online harassment and abuse against girls and women, the Biden-Harris Administration launched the 15-country Global Partnership for Action on Gender-Based Online Harassment and Abuse, which has advanced international policies to address online safety and supported programs to prevent and respond to technology-facilitated gender-based violence. Since the Global Partnership was launched in 2022, the Department of State has supported projects in every region to prevent, document, and address technology-facilitated gender-based violence, cultivate safe online use, and respond to survivors’ needs. 
    • Championing Girls’ Leadership in Addressing the Climate Crisis. In 2023, Vice President Harris announced the Women in the Sustainable Economy Initiative—an over $2 billion public-private partnership to promote women’s access to jobs in the green and blue industries of the future—including by advancing girls’ access to STEM education. Through WISE, the Department of State is investing more than $12 million in programs to benefit girls, including programs that promote girls’ economic skills and opportunities in STEM and that foster girls’ roles in leading, shaping, and informing equitable and inclusive climate policies and actions.
    • Strengthening HIV Prevention Services for Girls. To address key factors that make adolescent girls and young women particularly vulnerable to HIV, the United States launched the DREAMS (Determined, Resilient, Empowered, AIDS-free, Mentored, and Safe) public-private partnership as part of the President’s Emergency Plan for AIDS Relief (PEPFAR) in 2014. Announced in 2023, PEPFAR’s DREAMS NextGen program is the next phase of DREAMS that will take a more nuanced approach that is responsive to the current context within each of the 15 DREAMS countries. PEPFAR has invested more than $2 billion in comprehensive HIV prevention programming for girls through DREAMS—including $1.3 billion since the start of the Administration—and the program reaches approximately 2.5 to 3 million girls annually.
    • Increasing Efforts to End Child Marriage Globally. To address the global scourge of child, early, and forced marriage, USAID and the Department of State invested $86 million in 27 countries to support programs that prevent and respond to this harmful practice, including by equipping girls and young women with education and workforce readiness skills; providing education, health, legal, and economic support; and raising awareness. Under the leadership of the Biden-Harris Administration, the United States also made its first-ever contribution to the UNICEF-UNFPA Global Programme to End Child Marriage, which works in 12 countries in Africa and South Asia to promote the rights of adolescent girls, and is contributing more than $2 million in FY 2024 to UNFPA to help reach refugee adolescent girls and prevent child marriages in humanitarian settings.
    • Leading Programs to End Female Genital Mutilation and Cutting. To address the harmful practice of female genital mutilation and cutting (FGM/C), USAID invested in programs to address this issue in Djibouti, Egypt, Mauritania, and Nigeria. The United States is a long-standing donor to the UNICEF-UNFPA Joint Programme on the Elimination of Female Genital Mutilation, and invested $20 million from FY 2020-FY 2023 in this partnership, which has succeeded in advocating for legal and policy frameworks banning FGM/C in 14 of 17 countries and supported more than 6.3 million women and girls with FGM/C-related protection and care services.
    • Promoting Young Women’s Civic and Political Participation. The Biden-Harris Administration has advanced the political and civic participation of women and girls as a pillar of democracy promotion efforts worldwide. The Administration launched Women LEAD, a $900 million public-private partnership focused on building the pipeline of women leaders around the world, including by supporting programs to reach girls and young women. Under this umbrella, the USAID-led Advancing Women’s and Girls’ Civic and Political Leadership Initiative provides more than $25 million to identify and dismantle the individual, structural, and socio-cultural barriers to the political empowerment of women and girls in ten focus countries: Côte d’Ivoire, Nigeria, Tanzania, Kenya, Colombia, Ecuador, Honduras, Kyrgyz Republic, Yemen, and Fiji. Furthermore, the State Department is launching a new $1.25 million program in Africa that will empower and equip young women leaders to take on decision-making roles in democratic transition processes.
    • Protecting Girls in Humanitarian Emergencies. The United States government has increased its support for girls in humanitarian and fragile contexts. Since 2021, USAID has more than doubled the percentage of its humanitarian budget allocated to the protection sector, which includes child protection and gender-based violence activities serving girls. In FY 2023, USAID provided $163 million specifically towards addressing gender-based violence in humanitarian emergencies. In 2022, USAID and the Department of State launched Safe from the Start: ReVisioned, which seeks to better address the needs of girls and women from the onset of a conflict or crisis.
    • Combatting Child Trafficking. To combat child trafficking, including trafficking of girls, the Department of State has committed $37.5 million through Child Protection Compacts, building capacity in Jamaica, Peru, and Mongolia, and establishing new partnerships with Colombia, Cote d’Ivoire, and Romania. These partnerships strengthen country responses to child trafficking to more effectively prosecute and convict traffickers, provide comprehensive trauma-informed care for child victims—including girls—and prevent child trafficking in all its forms.

    ###

    MIL OSI USA News –

    January 23, 2025
  • MIL-OSI USA News: FACT SHEET: Delivering on Our Commitments, 12th U.S.-ASEAN Summit in Vientiane, Lao PDR

    Source: The White House

    The Biden-Harris Administration has worked to strengthen our ties with ASEAN and deliver on our commitments to the region. Over the past three and a half years, we have pursued an unprecedented expansion in the breadth and depth of U.S.-ASEAN relations, including upgrading our relationship to a Comprehensive Strategic Partnership and institutionalizing cooperation in five new areas—health, transportation, women’s empowerment, environment and climate, and energy—as well as deepening our cooperation in foreign affairs, economics, technology, and defense. To date, we have made significant progress in fulfilling 98.37 percent of our commitments in the ASEAN-U.S. Plan of Action (2022-2025) and its Annex. The United States will continue working with ASEAN, including through ASEAN-led mechanisms, to build an open, inclusive, transparent, resilient, and rules-based regional architecture with ASEAN at its center.
     
    DELIVERING ON OUR COMPREHENSIVE STRATEGIC PARTNERSHIP

    This year, the United States and ASEAN are celebrating 47 years of U.S.-ASEAN relations. President Biden and Vice President Harris remain committed to ASEAN centrality and supporting the ASEAN Outlook on the Indo-Pacific, which shares fundamental principles with the U.S. Indo-Pacific Strategy. ASEAN is at the heart of the U.S. approach to the Indo-Pacific, as reflected in numerous U.S. initiatives to promote economic prosperity and regional stability. Through the U.S.-ASEAN Comprehensive Strategic Partnership, the United States has demonstrated that we are a reliable and enduring partner for our combined one billion people. Key U.S.-ASEAN accomplishments under the Comprehensive Strategic Partnership include:

    • The U.S. Agency for International Development (USAID) extended the U.S.-ASEAN Regional Development Cooperation Agreement to 2029 enabling the launch of the new five-year ASEAN USAID Partnership Program in March 2024. 
    • The United States plans to conduct a second U.S.-ASEAN maritime exercise in 2025, co-hosted by Indonesia. U.S. and ASEAN Member States’ navies will exercise communication, information sharing, and the implementation of maritime security protocols in accordance with international law.
    • In August 2024, the United States and ASEAN agreed to formalize U.S.-ASEAN health cooperation, elevating our engagement to a biennial U.S.-ASEAN Health Ministers Dialogue. USAID also officially launched the U.S.-ASEAN-Airborne Infection Defense Platform to bolster the region’s tuberculosis response capacity.
    • The United States is launching a cybersecurity training program for the ASEAN Secretariat that will enhance the cybersecurity awareness, knowledge, and skills of our partners who are the backbone of ASEAN institutions.  
    • At the third U.S.-ASEAN High-Level Dialogue on Environment and Climate this year, the United States unveiled the U.S.-ASEAN Climate Solutions Hub to help ASEAN member states develop and implement their contributions under the Paris Agreement.
    • In 2023, the United States and ASEAN held the inaugural Dialogue on the Rights of Persons with Disabilities to advance human rights for persons with disabilities across Southeast Asia, including working with the private sector to find ways to support accessibility.

    As a reflection of the Comprehensive Strategic Partnership reaching its full potential, the United States and ASEAN celebrated the launch of the U.S.-ASEAN Center in Washington, DC in December 2023. The Center has already hosted several high-profile ASEAN-related events and is on track to become the key hub for ASEAN’s engagement with the United States.

    • In June 2024, the Center hosted the Secretary-General of ASEAN, Dr. Kao Kim Hourn, for his first working visit to the United States, where he launched a speaker series.
    • In August 2024, the Center hosted an ASEAN Day celebration, showcasing a wide array of cultural activities from ASEAN Member States.
    • The Center is also partnering with the Antiquities Coalition to host a Cultural Property Agreement workshop.

    The U.S.-ASEAN Smart Cities Partnership (USASCP) is a key mechanism for our engagement on innovating sustainable cities of the future. Since it was launched in 2018, USASCP has invested more than $19 million in over 20 projects across urban sectors throughout the region. USASCP tackles the varied challenges of rapid urbanization, including accelerating climate action and promoting sustainable urban services.

    • In 2024, the USASCP Smart Cities Business Innovation Fund 2.0 will grant $3 million for net-zero urban innovation projects to strengthen private sector investment in sustainability and climate action across the ASEAN region.
    • In 2022, the Smart Cities Business Innovation Fund 1.0 granted a total of $1 million to six awardees across the region, including a solar panel recycling facility in Da Nang, Vietnam and a seaweed/bioplastics manufacturer in Tangerang, Indonesia.
    • The United States paired municipal water and wastewater facility operators from five cities across the United States and the ASEAN Smart Cities Network to share their expertise.

    This year marks the Young Southeast Asian Leaders Initiative’s (YSEALI) second decade of building youth leadership capabilities across Southeast Asia to promote cross-border cooperation on regional and global challenges. YSEALI’s 160,000-strong digital network and 6,000-plus alumni community are creating new opportunities for its members to shape YSEALI’s next 10 years of impact. The State Department is well on its way to doubling the number of Southeast Asian youth participating in the YSEALI Academic and Professional Fellowships by 2025, in line with the commitments laid out by President Biden and Vice President Harris during the May 2022 U.S.-ASEAN Special Summit.

    • The United States has invested over $1.8 million to empower nearly 500 young women as part of the YSEALI Women’s Leadership Academy (WLA). In celebration of the WLA’s 10th anniversary, the U.S. Mission to ASEAN granted $44,000 to alumni groups to foster collaboration and find innovative ways to close the gender leadership gap.
    • The YSEALI Seeds for the Future Program—a grant program intended to support innovative initiatives in Southeast Asia—has provided nearly $3 million for more than 500 young leaders to carry out projects that improve their communities.
    • The Department of State’s YSEALI Alumni Engagement Innovation Fund supported 16 YSEALI alumni-led public service projects in 2024. 

    ENHANCING CONNECTIVITY AND RESILIENCE

    The Biden-Harris Administration continues to build greater connectivity with ASEAN and enhance regional resilience to bolster economic development and integration. The United States is ASEAN’s number one source of foreign direct investment, and U.S. goods and services trade totaled an estimated $500 billion in 2023. Since 2002, the United States has provided more than $14.7 billion in economic, health, and security assistance to Southeast Asian allies and partners. During that same period, the United States provided nearly $1.9 billion in humanitarian assistance, including life-saving disaster assistance, emergency food aid, and support to refugees throughout the region. As a durable and reliable partner of ASEAN, the United States supports the governments and people of Southeast Asia in enhancing the region’s connectivity and resilience. In addition to U.S. companies’ substantial investments, the United States is cooperating with the private sector to equip the region’s workforce with the skills needed to succeed in Southeast Asia’s burgeoning digital economy. Other key U.S. initiatives supporting this effort include:

    • USAID announced $2 million in new funding to support the sustainable development of critical minerals, supporting ASEAN’s goal of raising environmental, social, and governance standards for mineral sector development. 
    • Through the Japan-U.S.-Mekong Power Partnership (JUMPP), the U.S. Department of State has implemented over 60 technical assistance activities to strengthen national power sectors and regional electricity markets, enhancing the clean energy export potential of Cambodia, Lao PDR, Thailand, and Vietnam to the ASEAN market. 
    • The U.S. Trade and Development Agency is supporting a feasibility study to develop two cross-border interconnections, further expanding our longstanding support to connect the ASEAN Power Grid.
    • USAID is expanding cooperation with the ASEAN Center for Energy to support private sector and multilateral development bank investment to operationalize regional connectivity through the ASEAN Power Grid.
    • Through the ASEAN Digital Ministers’ Meeting and Digital Senior Officials’ Meeting, we are intensifying our cooperation on trusted information and communications technology infrastructure – including undersea cables, cloud computing, and wireless networks, artificial intelligence (AI), cybersecurity, and combatting online scams.
    • The United States supported development of the ASEAN Responsible AI Roadmap and provided AI technical assistance for the Digital Economy Framework Agreement. Our collective effort ensures ASEAN can foster an inclusive environment where affirmative, safe, secure, and trustworthy AI innovation can flourish.
    • Under the U.S.-ASEAN Connect framework, the U.S. Mission to ASEAN is leveraging U.S. government and private sector expertise to advance economic engagement, including through workshops covering topics such as best practices to strengthen cybersecurity and how to harness digital technologies.

    Over the past three and a half years, the Biden-Harris Administration has also spurred investment and economic growth through the advancement of over $1.4 billion in private sector investments in the ASEAN region. This past year alone, the U.S. International Development Finance Corporation (DFC) has invested over $341 million in ASEAN markets. To further our cooperation and support, DFC has announced that it will open new offices in Vietnam and the Philippines to source more opportunities and further advance private sector investment. DFC’s key initiatives and investments have included:

    • Providing a loan of up to $126 million to power company PT Medco Cahaya Geothermal to strengthen Indonesia’s energy security.
    • Initiating DFC’s first investment in Lao PDR with a $4 million loan portfolio guarantee to Phongsavanh Bank, which will work with Village Funds to give farmers financing to scale their businesses, increase their incomes, and improve their livelihoods.
    • Initiating DFC’s first investment in East Timor with a $3 million loan to microfinance institution Kaebauk Investimentu No Finansas, which will provide financing to small businesses, especially rural and unbanked ones.

    We look forward to continuing to advance our Comprehensive Strategic Partnership with ASEAN in 2025 by formulating a new plan of action to guide the next five years of our enduring partnership as we work to further the prosperity of our combined one billion people.

    ###

    MIL OSI USA News –

    January 23, 2025
  • MIL-OSI Africa: PEUGEOT Completes its EV Line-up with the New PEUGEOT E-408: Unexpected from Every Angle, 100% Electric

    Source: Africa Press Organisation – English (2) – Report:

    CASABLANCA, Morocco, October 10, 2024/APO Group/ —

    PEUGEOT (www.PEUGEOT.com) completes its EV line-up, with a fully electric version of the PEUGEOT 408, following the launch of the plug-in hybrid version in 2022. The new PEUGEOT E-408 combines the unexpected allure of a fastback silhouette with zero emission efficiency, the thrill of a powerful 157 kW/210 hp motor, and the pleasure of the PEUGEOT electric driving experience, with up to 453 km range. When it comes to recharging, the process is made simple with the integrated trip planner. PEUGEOT also offers total peace of mind to its customers by providing the PEUGEOT E-408 with 8 years/160,000 km warranty through its ALLURE CARE programme.

    ALLURE: With its fastback silhouette and 100% electric powertrain, the PEUGEOT E-408 is an entirely unique offering in the market.

    EMOTION: The pleasure of 100% electric driving is amplified with the PEUGEOT i-Cockpit® and its embedded trip planner.

    EXCELLENCE: The PEUGEOT E-408 completes PEUGEOT’s EV line-up, the widest of any mainstream manufacturer in the European electric market with 12 electric passenger cars and LCVs.

    By unveiling the PEUGEOT 408 in June 2022, PEUGEOT brought the allure of an unprecedented fastback silhouette to the top of the C segment. Unexpectedly different, the 408 stands out with its feline posture, dynamic lines offering an elevated driving position, and the premium sophistication of its design down to the finest details.

    The two electrified powertrains, PLUG-IN HYBRID 180 e-EAT8 and PLUG-IN HYBRID 225 e-EAT8, marked a first step in electrification for the 408. Earlier this year, the 48V HYBRID 136 e-DCS6 joined the 408 line-up. The new PEUGEOT E-408 takes this electric strategy to the next level with a zero-emission powertrain of 157 kW/210 hp paired with a 58.2 kWh (usable) NMC battery.

    The launch of the PEUGEOT E-408, with the opening of orders from 2nd October, marks the latest step in PEUGEOT’s ambition to become the mainstream EV leader in Europe. The new PEUGEOT E-408 will be built at the Mulhouse plant, benefits from the ALLURE CARE programme, and is warranted for up to 8 years / 160,000 km, the longest of any European brand.

    ALLURE: AN UNEXPECTED AND DYNAMIC FASTBACK DESIGN

    The innovative and unexpected fastback design perfectly matches the modernity of the new PEUGEOT E-408, built on a platform that allows for total electrification without compromising on style, dynamism, or interior comfort.

    With an overall length of 4.69m and a width of 1.85m (with the mirrors folded), the PEUGEOT E-408 uses the multi-energy E-EMP2 (Efficient Modular Platform), notable for its wheelbase length of 2.79 m. This generous dimension allows the battery to be installed in the car’s underbody, under the floor between the wheels, thus preserving the cabin space and lowering the PEUGEOT E-408’s centre of gravity for dynamic road behaviour where pleasure drives progress.

    This architecture combines the dynamic elegance of a fastback, road behaviour worthy of the best saloons, and a slightly elevated driving position that enhances daily enjoyment, safety, and comfort.

    A feline posture

    With its wide tracks – 1.59 m at the front and 1.60 m at the rear – the PEUGEOT E-408 is firmly anchored to the road. Despite being elevated, this model offers a sleek and sporty profile thanks to a limited height of 1.49 m, which improves aerodynamics.

    The feline character of the PEUGEOT E-408 is highlighted by the unique and sharp treatment of the body surfaces, particularly noticeable towards the rear – with the ‘cat’s ears’, the boot lid, and the shape of the wings, creating sharp facets designed to play with the light.

    Side body and wheel arch protections extend into a robust black rear bumper, which, by cutting the body colour diagonally, accentuates the rear’s dynamism. The large 19-inch Graphite wheels with innovative design receive 225/50 R19 tyres with very low rolling resistance (A+ class).

    A modern identity

    The body-colour treatment of the PEUGEOT E-408’s grille “dematerialises” it by blending it into the bumper’s overall shape – a sign of a generational change and the electrification era of the PEUGEOT range.

    The brand’s identity is more visible than ever through the sophisticated work on lighting. At the front, the LED technology allows for very thin – and very effective – headlights that form the PEUGEOT E-408’s look: a resolutely PEUGEOT look. The light signature extends downward with two LED strips in the shape of fangs plunging into the bumper. At the rear, PEUGEOT’s identity takes the form of the iconic three LED claws, inclined for even more dynamism.

    Five colours are available for the new PEUGEOT E-408: Okenite White, Obsession Blue, Selenium Grey, Elixir Red and Perla Nera Black.

    EMOTION: MORE THAN EVER, PLEASURE DRIVES PROGRESS

    Generous power, immediate torque… the 100% electric drive of the PEUGEOT E-408 offers pure driving pleasure. This is further amplified by the PEUGEOT i-Cockpit® and road behaviour, in true PEUGEOT tradition.

    A unique driving experience

    The incomparable PEUGEOT i-Cockpit® offers exceptional ergonomics. The compact steering wheel enhances driving pleasure by allowing unique agility and precision of movement. Positioned at eye level just above the steering wheel, the digital cluster includes a fully customisable and configurable 10-inch 3D digital panel.

    More than ever, driving pleasure is embedded in the new PEUGEOT E-408’s genes, with exemplary road handling, high-end ride comfort, and perfect manoeuvrability in the city, enabled by a curb-to-curb turning radius of 11.18 m. To improve vibrational comfort, the body rigidity is optimized by bonding structural elements.

    Performance contributes to driving pleasure

    The new PEUGEOT E-408 features a synchronous electric motor with permanent magnets developing 157 kW (210 hp) and a generous torque of 345 Nm. This motor is produced in France, in Trémery, by the STELLANTIS-NIDEC joint venture. The reducer it is associated with is manufactured by STELLANTIS in Valenciennes (France).

    The PEUGEOT i-Cockpit® with countless connected services*

    The 10-inch high-definition central screen allows you to control the PEUGEOT i-Connect® Advanced system, which comes standard on the PEUGEOT E-408 and offers efficient and effective TomTom connected navigation. For optimal readability, the map display covers the entire 10-inch touchscreen. As for system updates, they are carried out “over the air,” meaning directly through data transmission via the telecom network.

    Efficient navigation with a trip planner

    The navigation system includes a “trip planner” function that optimally plans routes to maximise the car’s range and facilitate recharging. To calculate the ideal route, the system takes into account numerous pieces of information, including the distance to be travelled, the battery charge level at the start, the desired battery charge level at the destination, speed, energy consumption, traffic, type of road, elevation, and of course, available charging stations near the destination.

    The e-Routes by Free2move Charge application is also accessible in the vehicle by connecting a smartphone to the PEUGEOT i-Connect® system. It optimises all trips by calculating the best route based on the vehicle’s range needs, the location of charging stations, traffic conditions, the distance to be travelled, etc.

    The mirroring function that connects the smartphone to the car’s infotainment system is wireless (Apple CarPlay/Android Auto), and it is possible to connect two phones via Bluetooth simultaneously. Four USB-C ports complete the connected setup of the PEUGEOT E-408.

    The fully configurable i-toggles, arranged under the central screen like an open book, provide a unique aesthetic and technology level in the segment. Each of the 5 customisable i-toggles offers a touch-sensitive shortcut to climate control settings, a phone contact, a radio station, an app launch… configured to the user’s choice. This can be customised for each driver, with up to 8 profiles.

    A daily ally for more safety and ease, the “OK PEUGEOT” natural language voice recognition command allows access to all infotainment functions and ChatGPT. Like all the latest generation PEUGEOTs, the new PEUGEOT E-408 integrates the generative artificial intelligence ChatGPT, which responds, via voice command, to all requests, such as tourist information or generating a quiz to keep children occupied during a trip…

    The MyPEUGEOT® smartphone app is particularly practical and allows:

    • Launching or scheduling thermal preconditioning. Beyond comfort, this feature allows, when the vehicle is plugged in, to optimise range (faster convergence of the temperature setpoint during startup phases by anticipating the optimal operating temperature of the battery).
    • Consulting, scheduling, launching, or delaying battery charging.
    • Activating the welcome light sequence, for example, to locate the car in a crowded parking lot.

    A warm atmosphere inside the cabin

    The new PEUGEOT E-408 is designed as a high-end fastback in the C segment. It offers numerous features intended to fully enjoy the pleasure of travel and mobility.

    Inside the new PEUGEOT E-408, the LED ambient lighting (8 colours to choose from) behind the central screen diffuses a soft light and contributes to the sophisticated cabin ambiance. The same light extends to the padded door panels, which are covered with either fabric, Alcantara® (RHD), or real stamped aluminum pieces (LHD), depending on the trim level.

    The thermal and acoustic comfort of the new PEUGEOT E-408 is optimised by the technologies implemented for the design and manufacture of its windows:

    • At the front and rear, the windows have an above-average thickness (3.85 mm).
    • At the front, the side windows are laminated (3.96 mm on GT) for better sound insulation and increased security.

    Of course, the air conditioning contributes to the thermal comfort of the occupants. The vents bringing fresh air into the cabin are positioned high at the front, and the rear passengers benefit from 2 air vents placed at the back of the central console.

    To ensure a healthy interior atmosphere, the PEUGEOT E-408 GT can be equipped with the optional AQS (Air Quality System), which continuously monitors the quality of the air entering the cabin and can automatically activate air recirculation. This serenity is complemented on the GT level by the Clean Cabin, an air treatment system with pollutant gas and particle filtration, with the air quality being displayed on the central touch screen.

    The new PEUGEOT E-408’s Hi-Fi Premium FOCAL® system is a result of over 3 years of co-design working with the high-end audio specialist. Complemented by ARKAMYS digital sound processing, the Hi-Fi Premium FOCAL® system consists of 10 speakers with exclusive patented technologies:

    • 4 TNF tweeters with inverted aluminum domes,
    • 4 165 mm woofer/midrange speakers with Polyglass membranes and TMD (Tuned Mass Damper) suspension,
    • 1 Polyglass central channel,
    • 1 Power Flower™ triple coil oval subwoofer.

    These are paired with a new 12-channel 690 W amplifier (boosted class D technology).

    Particularly enveloping, the front seats have obtained the AGR (Aktion Gesunder Rücken) label, awarded by an independent German association of ergonomics and back-health experts. This label rewards both the ergonomics and the range of adjustments of the front seats. These can also have 10-way electric adjustments with two possible memory settings for the driver, 6 ways for the passenger, as well as 8-pocket pneumatic massage with 8 different programs, and heated seats.

    The seat design has been conceived to highlight the quality of the materials used: mottled fabric, technical meshes, Alcantara, embossed leather, and nappa leather (for select markets). On the GT versions, the seats are adorned with an Adamite-colour signature thread, which also outlines the dashboard, door panels, and padded console pads.

    Between the front seats, the central console’s arch extends to a space dedicated to wireless phone charging. Thus, the rest of the console is entirely dedicated to storage and practicality, with an armrest, 2 USB C ports (charge/data), 2 large-diameter cup holders, and up to 33 liters of various storage.

    The rear space is particularly generous, thanks to the long wheelbase of 2.79 m, making the new PEUGEOT E-408 the most spacious PEUGEOT for rear-seated passengers: they benefit from 183 mm of leg room. The footwell, the space dedicated to the rear passengers’ feet under the first-row seats, is designed to maximise freedom of movement; the seat design and seating angle are intended to give passengers the opportunity to make the most of their space for optimal comfort during trips.

    Connectivity is not left behind with the presence, from the Allure level, of 2 USB C charging ports at the back of the central console.

    The new PEUGEOT E-408 offers a 2-part (60/40) bench seat with a ski hatch as standard. In the GT trim, it benefits from an immediate folding system for its 2 parts, operated by two easily accessible controls on the boot sides.

    The boot volume of the new PEUGEOT E-408 is particularly generous, offering 471 dm3  of loading capacity. With the rear seats folded, the space available is further increased to 1,545 dm3. Once the bench seatback is folded down, it is possible to load an object up to 1.89 m long. For daily practicality, the boot area is equipped with a 12V socket located on the right boot trim, LED lighting, a net and storage elastic, and bag hooks.

    EXCELLENCE: A CONSTANT QUEST FOR EFFICIENCY, SAFETY, AND QUALITY

    Efficiency was at the heart of the PEUGEOT teams’ concerns throughout the design and development of the PEUGEOT E-408.

    Designed for a smooth energy transition

    The aerodynamics of the new PEUGEOT E-408 (SCx: 0.66) received particular attention: bumpers, front air intake, underbody screen, and the lower guards for the front wheels were all optimised. The new PEUGEOT E-408 also receives a specific underbody forming an aerodynamic flat floor. The result is low electricity consumption of 15.2 kWh/100 km and up to 453 km of combined range according to the WLTP cycle.**

    The PEUGEOT E-408 is equipped with a high-voltage battery of 58.2 kWh usable. With NMC 811 technology – 80% Nickel, 10% Manganese, 10% Cobalt – it benefits from increased energy density with 18 onboard modules. The new PEUGEOT E-408 offers a range of 453 km in the WLTP mixed cycle, meeting the needs of most C-segment customers, whose typical daily mileage is under 45 km (industry data).

    Regenerative braking allows for a smoother driving experience. Using the paddles behind the compact steering wheel, the driver can easily select one of 3 levels of regenerative braking: the left paddle increases regeneration and the right one decreases it. The three levels are: Low (-0.6 m/s²), for sensations close to a thermal vehicle; Moderate (-1.3 m/s²), for increased deceleration when releasing the accelerator pedal; and Increased (-2.0 m/s²), for maximum deceleration when releasing the accelerator pedal and thus maximum regeneration. The last two levels automatically illuminate the rear stop lights.

    The driver can also choose between three drive modes, depending on their priorities. Normal is the default mode, setting the power at 140 kW (190 hp) and torque at 300 Nm, offering an ideal balance between dynamism and range. The Sport mode (157 kW/210 hp and 345 Nm) is available for maximum performance and activates automatically and temporarily during “kick downs.” The ECO mode (125 kW/170 hp, 270 Nm) favours range while preserving driving pleasure.

    The new PEUGEOT E-408 is equipped as standard with a heat pump, as well as a heated steering wheel and heated seats, optimising passenger thermal comfort while preserving battery energy.

    A simple and fast recharge

    For AC charging, the new PEUGEOT E-408 is equipped as standard with an 11 kW three-phase charger. For DC charging via superchargers, the PEUGEOT E-408 accepts power up to 120 kW, allowing a charge from 20% to 80% of the battery in just over 30 minutes (under nominal battery temperature conditions) and recovering 100 km of range in just over 10 minutes. To optimise charging, the driver can programme the lower and upper thresholds from the PEUGEOT E-408’s central screen – for example, from 20% minimum charge to 80% maximum charge.
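    As a rough sanity check of the quoted 20–80% DC fast-charge time, one can divide the energy added by an average charging power. Note that 120 kW is a peak figure; the average power assumed below (70 kW, to account for the charging curve tapering) is an illustrative assumption, not a figure from this release.

```python
# Rough estimate of DC fast-charge time for a 58.2 kWh (usable) battery.
# AVG_DC_KW is an assumed average session power, NOT an official figure:
# real chargers start near the 120 kW peak and taper as the battery fills.

USABLE_KWH = 58.2   # usable battery capacity (from the release)
AVG_DC_KW = 70.0    # assumed average charging power over the session

def charge_minutes(start_soc: float, end_soc: float,
                   capacity_kwh: float = USABLE_KWH,
                   avg_power_kw: float = AVG_DC_KW) -> float:
    """Estimate charging time in minutes between two states of charge."""
    energy_kwh = (end_soc - start_soc) * capacity_kwh
    return energy_kwh / avg_power_kw * 60

print(f"20% -> 80%: about {charge_minutes(0.20, 0.80):.0f} minutes")
```

    With these assumptions the estimate lands close to the "just over 30 minutes" quoted above; a different charging curve would shift it accordingly.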

    Something for everyone

    Two plug-in hybrid engines are also available on the PEUGEOT 408:

    PLUG-IN HYBRID 225 e-EAT8: 2-wheel drive / combination of a 180 bhp (132 kW) turbo engine and an 81 kW electric motor coupled with the e-EAT8 8-speed automatic gearbox / currently undergoing homologation.

    PLUG-IN HYBRID 180 e-EAT8: 2-wheel drive / combination of a 150 bhp turbo engine (110kW) and an 81kW electric motor coupled with the 8-speed e-EAT8 automatic gearbox / currently undergoing homologation.

    The Li-ion battery on both plug-in hybrid versions has a capacity of 12.4kWh. Two types of on-board chargers are available: a 3.7kW single-phase charger as standard and an optional 7.4kW single-phase charger.

    Estimated charging times are the following:

    • From a 7.4kW Wall Box (32 A) and with the 7.4kW single-phase on-board charger, fully charged in 1 hour 40 minutes.
    • From a reinforced socket (14 A) and with the 3.7kW single-phase on-board charger, fully charged in 3 hours 55 minutes.
    • From a standard socket (8A) and with the single-phase on-board charger (3.7kW), full charging takes approximately 7 hours 05 minutes.
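    The charging times above follow roughly from dividing the 12.4 kWh battery capacity by the effective charging power, which is limited by either the socket current or the on-board charger. A minimal sketch, assuming a 230 V supply and ignoring charging losses (so the estimates run slightly short of the official figures):

```python
# Back-of-the-envelope check of the plug-in hybrid AC charging times.
# Assumes a 230 V single-phase supply and no losses, both simplifications:
# official figures include overheads, so they come out a little longer.

BATTERY_KWH = 12.4  # plug-in hybrid battery capacity (from the release)

def ac_charge_hours(supply_amps: float, charger_kw: float,
                    mains_volts: float = 230.0) -> float:
    """Full-charge time in hours, limited by socket or on-board charger."""
    supply_kw = supply_amps * mains_volts / 1000
    effective_kw = min(supply_kw, charger_kw)  # weakest link sets the rate
    return BATTERY_KWH / effective_kw

for label, amps, charger in [("7.4 kW Wall Box (32 A)", 32, 7.4),
                             ("reinforced socket (14 A)", 14, 3.7),
                             ("standard socket (8 A)", 8, 3.7)]:
    h = ac_charge_hours(amps, charger)
    print(f"{label}: ~{int(h)} h {round(h % 1 * 60):02d} min")
```

    The weakest-link rule explains why the 3.7 kW charger is not the limit on the reinforced and standard sockets: at 14 A and 8 A the socket itself caps the power.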

    One hybrid engine is available on the PEUGEOT 408:

    HYBRID 136 e-DCS6: 2-wheel drive / combination of a 136 hp turbo engine (100kW) and a 48V battery coupled with the 6-speed e-DCS6 automatic gearbox.

    This PEUGEOT HYBRID 48V system consists of a new-generation 136 hp petrol engine coupled with a dual-clutch 6-speed gearbox that incorporates an electric motor. Thanks to a battery that recharges while driving, this technology offers extra torque at low revs and a reduction of up to 15% in fuel consumption (5.2 l/100 km in WLTP mixed cycle**). In urban driving, the new 408 Hybrid 136 e-DCS6 can operate up to 50% of the time in 100% electric zero-emission mode.

    Maximum safety for optimal peace of mind

    Onboard the new PEUGEOT E-408, a comprehensive set of latest-generation driving aids, powered by information gathered from 5 cameras and 3 radars, secure and ease driving, maneuvers, and travel. Some of these systems are directly derived from higher segments:

    • Adaptive cruise control with Stop and Go function and adjustable inter-vehicle distance setting.
    • Automatic emergency braking with collision risk alert: it detects pedestrians and cyclists, day and night, from 7 km/h to 140 km/h depending on the version.
    • Active lane departure warning with trajectory correction.
    • Driver attention alert detecting vigilance issues during long drives and at speeds above 65 km/h, using steering wheel micro-movement analysis.
    • Extended recognition and display on the digital cluster of traffic signs: stop, no entry, no overtaking, end of no overtaking, in addition to the usual speed-related signs.
    • Long-range blind spot monitoring (75 metres).
    • Rear traffic alert: when reversing, warns of traffic approaching nearby.

    A clear and straightforward range

    The new PEUGEOT E-408 is available in two trims: Allure and GT.

    The PEUGEOT E-408 Allure comes standard with: LED headlights, 19” alloy wheels, PEUGEOT i-Cockpit® with a customisable 10” digital instrument cluster, connected navigation with trip planner, OK PEUGEOT voice command, wireless mirroring Apple CarPlay/Android Auto, 6-speaker audio system, heated driver seat and steering wheel, dual-zone automatic climate control, rear parking camera and sensors, heat pump, etc.

    The PEUGEOT E-408 GT comes standard with, in addition to the Allure version’s equipment: Matrix LED headlights, front parking sensors, PEUGEOT i-Cockpit® with a customisable 10” digital instrument cluster, aluminum interior trims with customisable 8-colour ambient lighting, aluminum door sills, hands-free motorised tailgate, Drive Assist Plus package (Level 2 semi-autonomous driving), etc.

    Superior quality

    The new PEUGEOT E-408 is positioned at the top of the C segment, offering ergonomics, quality, finish, and equipment worthy of higher categories.

    As on all its 100% electric models, PEUGEOT will offer its PEUGEOT Allure Care program on the new PEUGEOT E-408, which covers the electric motor, charger, transmission, and main electrical and mechanical components for up to 8 years or 160,000 kilometers. PEUGEOT Allure Care complements the specific PEUGEOT warranty that already applies to the high-voltage battery for 8 years/160,000 km to provide comprehensive vehicle coverage. PEUGEOT Allure Care renews automatically and free of charge every 2 years or 25,000 kilometers, following each maintenance performed within the PEUGEOT network.

    Owners of the PEUGEOT E-408 will benefit from reduced maintenance constraints, with a service program every 2 years or 25,000 kilometers.

    *Some services may require a subscription.

    ** WLTP cycle under approval 

    MIL OSI Africa –

    January 23, 2025
  • MIL-OSI Canada: Remarks by the Deputy Prime Minister announcing new actions to build secondary suites and unlock vacant lands to build more homes

    Source: Government of Canada News

    October 8, 2024 – Ottawa, Ontario

    Check against delivery

    Introduction

    Good morning.

    I’m going to start on a very celebratory note. I want to start by congratulating the amazing Geoffrey Hinton on his Nobel Prize in physics. He is a great Canadian. He is absolutely brilliant. He happens to be a constituent of mine and, as the father of AI, is the teacher of generations of great Canadian intellectual leaders who have been taught by him, and who have learned from him at the University of Toronto. What a wonderful accomplishment. This is an honour which is richly deserved, and I think I speak for all Canadians in saying we are so proud of you and so grateful to you.

    Today, I will tell you about the new measure our government is taking to build new housing. Minister Jean-Yves Duclos (Minister of Public Services and Procurement) will tell you about the latest additions to the Canada Public Land Bank, a very important program that continues. And after that, Minister Terry Beech (Minister of Citizens’ Services) will tell you about the impact of these measures for Canadians.

    I do want to start by talking for a moment about the good economic news we’ve been having in recent weeks. Canada is leading the G7 in achieving a soft landing after the COVID recession. Inflation fell to 2 per cent in August. That is a 42-month low and it means that, for all of this year, inflation has been within the Bank of Canada’s target range.

    Thanks to that inflation trajectory, the Bank of Canada led the G7 in cutting rates. Canada was the first G7 country to cut interest rates for the first time, we were the first G7 country to cut interest rates for the second time, and we were the first G7 country to cut interest rates for the third time.

    Wages have been outpacing inflation for 19 months in a row now. What all of that means for Canadians is their paycheques are going further. And for people who own a home and have a mortgage that is coming up for renewal, the fact that interest rates are coming down is a source of really great relief.

    Now on our announcement. We are announcing today new rules about secondary suites, and we’re issuing technical guidance for lenders and insurers to offer refinancing for secondary suites. These will come into force on January 15th.

    The idea here is to make it easier for people to build a secondary suite in their home, for someone to build a basement flat, a garden flat, or laneway housing. This is all about gentle density, creating more homes for Canadians to live in. It builds on the secondary suite loan program, which was announced in Budget 2024.

    Specifically, we’re going to allow refinancing of insured mortgages to build a secondary suite in your home. You will be able to access up to 90 per cent of the home value, including the value added by the secondary suite, and you will be able to amortize your refinanced mortgage for up to 30 years. The limit for insured mortgages, if you are building a secondary suite, will be $2 million, and that is particularly important as a recognition of conditions in the GTA and in the Lower Mainland.
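    As a rough illustration of the announced limits (the function and figures below are our own sketch, not part of the announcement, and not financial guidance), the 90 per cent access, the $2 million insured cap, and an outstanding balance combine in a simple calculation:

```python
def max_suite_refinance(home_value_with_suite: float,
                        outstanding_mortgage: float,
                        ltv_cap: float = 0.90,
                        insured_limit: float = 2_000_000) -> float:
    """Sketch of the announced rules: total insured borrowing of up to
    90 per cent of the home value (including the value the suite adds),
    capped at the $2 million insured-mortgage limit."""
    max_total = min(home_value_with_suite * ltv_cap, insured_limit)
    return max(max_total - outstanding_mortgage, 0.0)

# A home worth $1,000,000 after the suite, with $500,000 still owed,
# leaves up to $400,000 of additional refinancing room.
print(max_suite_refinance(1_000_000, 500_000))  # 400000.0
```

    For a more expensive home, the $2 million cap binds first: a $3,000,000 home with $500,000 owing would be limited to $1,500,000 of additional room, not 90 per cent of its value.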

    This is really about giving Canadians, Canadian homeowners the opportunity to be part of our great national effort to build more homes faster. It’s to let a family who already owns a home and maybe would like their grandmother or grandfather, or both of them, to move in with them to give them access to a little bit more money to build that basement flat, to build that garden suite, so that grandparents can move in.

    It’s also about grandparents who have a big house. Maybe they are alone in that house, and they’d like a grandchild to be able to move in with them to go to school. This is about making it easier for them to build that extra space. And we see this as a measure which goes alongside other measures that we’ve put in place—designed for the big builders to get more homes built faster, to get more rental units built. This is about saying regular Canadians should have the ability and access to the financing to build gentle density in their neighbourhoods. To build density that their families and their communities need.

    The second announcement is a consultation on taxation of vacant land. We believe that good land should not be left unused. Ireland, for example, has a measure like that. Today, we are announcing consultations with municipalities, provinces and territories to discuss whether we need such a measure here in Canada.  And the objective, like all our objectives concerning housing, is to build more housing faster. We know that Canada needs this.   

    We know that one of the most pressing issues for Canada, for Canadians, is housing. And we know that the centre of that issue, the centre of the solution, needs to be to get more homes built faster. Today’s announcements are another arrow in our quiver of measures to get more homes built faster in Canada. This is about getting 4 million homes built.

    I’m now going to turn it over to my colleague, Jean-Yves Duclos.

    MIL OSI Canada News –

    January 23, 2025
  • MIL-OSI Security: Halifax Regional Municipality — RCMP warning of cryptocurrency investment scam

    Source: Royal Canadian Mounted Police

    RCMP Halifax Regional Detachment is warning the public about a cryptocurrency investment scam reported in the Halifax Regional Municipality.

    On October 2, RCMP officers learned that a man interacted with a woman through social media platforms and was convinced to invest in a cryptocurrency app. The victim was defrauded of more than $400,000.

    With the introduction of cryptocurrencies, these scams are becoming more common and harder to detect.

    RCMP officers stress the importance of due diligence when considering investment opportunities. You can protect yourself by following these tips:

    • Be cautious: Be wary of anyone offering high-reward, low-risk investment opportunities. If it’s too good to be true, it’s probably a scam.
    • Do your research: Take the time to investigate an investment opportunity. Anyone who trades or advises on securities in Nova Scotia must be registered with the Nova Scotia Securities Commission (NSSC). If someone isn’t registered with the NSSC or another Canadian securities regulator, it’s likely they’re a scammer.
    • Get advice: Remember that cryptocurrencies are currently unregulated in Canada and don’t have the same fraud protection as credit cards, nor are they covered by the Canada Deposit Insurance Corp. Always use well-known and reputable exchanges to purchase cryptocurrency. When in doubt, seek advice from a reputable financial institution.

    If you or someone you know is a victim of investment fraud, report it to your local police and the Canadian Anti-Fraud Centre.

    File #: 24-135214

    MIL Security OSI –

    January 23, 2025
  • MIL-OSI USA: Baldwin Introduces Bipartisan Legislation to Stop Federally Funded School Buses from Being Manufactured in China

    US Senate News:

    Source: United States Senator for Wisconsin Tammy Baldwin

    WASHINGTON, D.C. – U.S. Senator Tammy Baldwin (D-WI) joined a group of bipartisan colleagues to introduce the Secure School Buses Act, legislation to ensure school bus manufacturers tied to foreign entities and countries of concern, including the Chinese Communist Party (CCP), do not receive federal funding.

    “When we use taxpayer dollars, we should be investing those dollars back into American businesses, workers, and communities – not sending money overseas to adversaries like China,” said Senator Baldwin. “I’m proud to work with my Democratic and Republican colleagues to ensure taxpayer investments in our children’s school buses won’t line the pockets of bad actors like China and give them a competitive edge over our workers and businesses.”

    Several years ago, the Environmental Protection Agency (EPA) established the Clean School Bus Program to replace existing school buses with cleaner alternatives. According to the EPA, it has awarded almost $3 billion in taxpayer funds through this program. Troublingly, certain companies in the electric bus industry have ties to the CCP and other foreign entities of concern. While federal funds are prohibited from going to companies with such ties for public transit, there are no such prohibitions for the procurement of school buses. The Secure School Buses Act would prohibit the award of federal grant funding to school bus manufacturers with certain ties to a foreign entity of concern.

    Senator Baldwin has long pushed to close loopholes that allow federal funding to be used for purchasing and manufacturing equipment overseas, including through her bipartisan Buy America for Small Shipyard Grants, SAFE TRAINS Act, and Made in America Act, which were signed into law.

    The Secure School Buses Act is led by Senator Marsha Blackburn (R-TN) and also co-sponsored by Senators Mark Kelly (D-AZ) and John Cornyn (R-TX). The bill is endorsed by the Alliance for American Manufacturing and Heritage Action.

    Click here for bill text.

    MIL OSI USA News –

    January 23, 2025
  • MIL-OSI Economics: LLMs are becoming a commodity—Now what?

    Source: Microsoft

    Headline: LLMs are becoming a commodity—Now what?

    Whenever a compelling new AI model emerges, I like to put it through its paces. Recently, I’ve been experimenting with the preview of OpenAI o1 (formerly known as Strawberry), an astonishing new LLM that’s capable of solving complex and layered problems, especially in math, science, and coding. 

    For businesses, the o1 model and a slew of others in the works represent a clear opportunity. But they also reflect a less obvious challenge: as LLMs become more sophisticated, they’ll also be quickly commoditized, with not a lot of differentiation between them.  

    In other words, today’s breakthroughs will become tomorrow’s table stakes. This means companies should focus more on how they integrate these models with their own data and workflows, rather than seeing the models themselves as a unique competitive advantage. Embracing this shift in mindset is the way to ensure your business stays ahead.  

    Decoding the latest advance 
    We have historically relied on size to improve the capabilities of LLMs—training them on more and more data, a process that is incredibly time- and resource-intensive.    

    OpenAI o1 introduces an entirely new scaling dimension, one in which a model can become significantly more capable by taking more time to “think” or reason before it responds. That means o1 can tackle problems step by step, much like how a human might approach challenging questions.  

    Ethan Mollick, professor at the Wharton School at the University of Pennsylvania, tried the o1 preview on a tough segment of a crossword puzzle and it performed quite well (though not flawlessly). Crossword puzzles trip up other LLMs because they can’t perform the iterative thinking that’s required: trying a word, scratching it out when it doesn’t fit, and cross-referencing clues to see how answers might fit together. 

    People across the business world are already experimenting with how o1 can handle tasks like responding to RFPs or performing risk assessments. It’s clear that we’ll look back and consider o1 to be one of the most pivotal advancements in generative AI. 

    So if o1 is such a breakthrough, why am I arguing that models will be commoditized? It comes down to competition. With so much energy and opportunity in the AI space, model developers are racing to exceed one another’s advances. We can expect to see more models, from more providers, with more capabilities on par with one another. 

    Technology and commoditization 
    Think of another technology that was groundbreaking for its time: the television. Once a rare luxury made by only a few companies, TVs are now produced by many manufacturers, with excellent models widely available. About two decades ago, flat-screen TVs were coveted and expensive. Now it can cost as much to mount a TV on the wall as it does to buy the TV itself, and “flat-screen TV” has become a redundant phrase. We expect LLMs to follow a similar path to commoditization, but at a swifter pace.  

    What does this mean for businesses? Leaders have to look beyond the LLMs themselves and focus on creating a system around the models that will serve the unique needs of their organizations. Only by understanding AI systems more holistically will they be able to leverage them to innovate, create value, and maintain a competitive edge.  

    Unlocking the real value of AI for business 
    LLMs get a lot of attention in the media, but the real value of AI comes from how you steer, ground, and fine-tune these models with your business data and workflow. And those capabilities come from the full system that surrounds the LLM. 

    Consider the evolution of personal computers. At first the raw power of the CPU was the most critical factor. But as powerful CPUs became commodities, the value of the PC shifted to the overall system—the combination of hardware and software that met your needs. Today, we don’t judge a PC by the power of a single component; it’s the value of the entire package that differentiates one device from another. 

    The same goes for AI: the system is more powerful than any one part. An LLM on its own, no matter how impressive, won’t deliver truly valuable results until it’s grounded in your company’s specific knowledge. When a system like Copilot can draw from your work data—emails, files, meetings, etc.—it becomes much smarter about your business. The system performs better when you can steer it toward your goals and fine-tune it to adapt to your specific needs. Together, all these elements feed the advanced “thinking” that the LLMs can and will be doing. 
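    The grounding idea can be sketched in miniature. The toy retriever below is our own illustration, not Copilot’s actual mechanism: it ranks company documents by word overlap with a question and prepends the best matches as context for an LLM prompt.

```python
import re

def tokens(text: str) -> set[str]:
    """Lowercase alphanumeric tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def ground_prompt(question: str, documents: list[str], top_k: int = 2) -> str:
    """Rank documents by word overlap with the question and prepend the
    best matches as context. A real system would use embeddings,
    permissions checks, and freshness signals instead of raw overlap."""
    q = tokens(question)
    ranked = sorted(documents, key=lambda d: len(q & tokens(d)), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Context:\n{context}\n\nQuestion: {question}"

# Hypothetical retailer documents, echoing the training example above.
docs = [
    "Q3 promotion: 20% off outerwear",
    "Return policy: 30 days with receipt",
    "Store hours: 9am to 9pm daily",
]
print(ground_prompt("What is the current promotion on outerwear?", docs, top_k=1))
```

    The point of the sketch is where the differentiation lives: the model is interchangeable, but the retrieval layer that feeds it your promotions, policies, and data is specific to your business.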

    Think about how this system would work for, say, a retailer. An LLM on its own can offer general ideas for training new employees for the sales floor. But AI is more powerful if it also knows the specifics of your business. A highly effective AI agent might create and deliver training modules for your new retail employees, with insight into your latest products, up-to-the-minute promotions, and specialized customer service techniques. 

    Summing it up  
    LLMs are making incredible progress, and I’m delighted every day by what they can accomplish. But their true potential comes through when they’re applied to your unique business data and workflows. That way, they’ll solve more than puzzles—they’ll help untangle your thorniest business problems and reveal new opportunities for creating value.

    MIL OSI Economics –

    January 23, 2025
  • MIL-OSI USA: Final 2023 Annual Electric Sales and Revenue Data

    Source: US Energy Information Administration

    Form EIA-861, Annual Electric Power Industry Report, and Form EIA-861S (the short form) collect data from distribution utilities and power marketers of electricity. This survey is a census of all United States electric utilities. The short form is intended for smaller bundled-service utilities and has less detailed responses. This survey collects more data than its monthly counterpart, Form EIA-861M. Data are the individual survey responses and are included in the files described below.

    Our survey page contains the current survey form, instructions, respondent portal, and frequently asked questions. Data from these files can be found throughout our publications, usually in aggregated form in our Electric Power Annual (EPA) report; State Electricity Profiles (SEP); Electric Sales, Revenue, and Average Price (ESR) report; Electricity Data Browser; and in some Today in Energy articles.

    Please refer to our Guide to EIA Electric Power Data and send any questions to InfoElectric@eia.gov.

    In 2012, we created Form EIA-861S to reduce respondent burden and to increase our processing efficiency; that year, about 1,100 utilities initially reported on this form instead of Form EIA-861. In 2020, that number increased to about 1,700 utilities. We reformatted the files for the years 1990–2011 to make them easier to understand and to match the format and titles of the current files, but we didn’t change or update any data.

    • Frame
      • Surveys: Form EIA-861 and Form EIA-861S
      • Time frame: 2016 to present
      • Description: The data contain a complete list of all respondents from both forms and which files they have data in.
    • Advanced Metering
      • Surveys: Form EIA-861 and Form EIA-861S
      • Time frame: 2007 to present
      • Description: The data contain the number of meters from automated meter readings (AMR) and advanced metering infrastructure (AMI) by state, sector, and balancing authority. The energy served (in megawatthours) for AMI systems is provided. Form EIA-861 respondents also report the number of standard meters (non-AMR/AMI) in their system.
      • Historical Changes: We started collecting the number of standard meters in 2013. The monthly survey collected these data from January 2011 to January 2017.
    • Balancing Authority
      • Surveys: Form EIA-861 and Form EIA-861S
      • Time frame: 2012 to present
      • Description: The data contain the list of balancing authorities and the states they operate in.
    • Delivery Companies
      • Survey: Form EIA-861
      • Time frame: 2020 to present
      • Description: The data contain revenue, sales, and customer count by sector from utilities that deliver energy in Texas.
    • Demand Response
      • Survey: Form EIA-861
      • Time frame: 2013 to present
      • Description: The data contain energy demand response programs by state, sector, and balancing authority. We collect data for the number of customers enrolled, energy savings, potential and actual peak savings, and associated costs.
    • Distribution Systems
      • Survey: Form EIA-861
      • Time frame: 2013 to present
      • Description: The data contain the number of distribution circuits and circuits with voltage optimization by state.
    • Dynamic Pricing
      • Survey: Form EIA-861
      • Time frame: 2013 to present
      • Description: The data contain the number of customers enrolled in dynamic pricing programs by state, sector, and balancing authority. Respondents check if one or more customers are enrolled in time-of-use pricing, real time pricing, variable peak pricing, critical peak pricing, and critical peak rebates.
    • Energy Efficiency
      • Survey: Form EIA-861
      • Time frame: 2013 to present
      • Description: The data contain incremental energy savings, peak demand savings, weighted average life cycle, and associated costs for the reporting year and life cycle of energy efficiency programs.
    • Mergers
      • Survey: Form EIA-861
      • Time frame: 2007 to present
      • Description: The data contain information on mergers and acquisitions.
    • Net Metering
      • Survey: Form EIA-861
      • Time frame: 2001 to present
      • Description: The data contain cumulative installation count and capacity of generators that are net metered by technology, state, sector, and balancing authority. If available, the energy sold back to the grid is also reported. Technology types include photovoltaic (standard, virtual less than 1 megawatt, and virtual 1 megawatt or greater), wind, and other. Storage systems that are paired with net-metered photovoltaic (PV) are also captured. We make a state-level adjustment for missing PV capacity and to convert state total capacity to AC units for those respondents who report data in DC units; we use 0.8256 as a conversion factor to change DC to AC. For other energy sources, we have not established imputation procedures.
      • Historical Changes: Initially, data contained only the customer count. In 2007, energy displaced was added (later renamed to energy sold back). We added capacity of systems in 2010, and we divided this category by technology type: PV, wind, and other. In 2016, we added a question to the survey about whether the megawatts reported for the PV systems were in AC or DC units. Also in 2016, the survey divided PV to include virtual systems and storage systems paired with PV. Starting in 2020, data for Form EIA-861S respondents were imputed.
    • Non-Net Metering Distributed
      • Survey: Form EIA-861
      • Time frame: 2010 to present
      • Description: The data contain cumulative values of generators that are not net metered and are under 1 megawatt in size (and not reported on Form EIA-860). Installations, total capacity, capacity owned, and capacity backup are reported in aggregate by state, sector, and balancing authority. Capacity is also reported by technology, state, sector, and balancing authority. Technology types include combustion turbine, internal combustion engine, fuel cells, hydroelectric, photovoltaic (PV), steam turbine, storage, wind, and other. Form EIA-861S respondents do not provide non-net-metering distributed data. A state-level adjustment is made for missing PV capacity and to convert state total capacity to AC units for those respondents who report data in DC units; we use 0.8256 as a conversion factor to change DC to AC, which uses the responses from the net-metering schedule. For other energy sources, we have not established imputation procedures.
      • Historical Changes: This schedule was previously referred to as distributed generation; we renamed it in 2016 to prevent double counting with net-metered systems. Data on dispersed systems (systems not connected to the grid) were collected up to 2015. In 2016, we added data on fuel cells. Starting in 2016, these data were broken out by sector, and an adjustment was made to convert state total capacity to AC units for those respondents who report data in DC units; we use 0.8256 as the DC-to-AC conversion factor. Starting in 2020, data for Form EIA-861S respondents were estimated.
    • Operational Data
      • Survey: Form EIA-861
      • Time frame: 1990 to present
      • Description: The data contain aggregate operational data for the source and disposition of energy and revenue information from each electric utility.
    • Reliability
      • Survey: Form EIA-861
      • Time frame: 2013 to present
      • Description: The data contain information on non-momentary electrical interruptions. If collected, utilities report the system average interruption duration index (SAIDI), the system average interruption frequency index (SAIFI), and the conditions under which these metrics are collected. We allow respondents to use IEEE standards or any other method. We created a short video to describe what is collected.
    • Sales to Ultimate Customers
      • Surveys: Form EIA-861 and Form EIA-861S
      • Time frame: 1990 to present
      • Description: The data contain revenue, sales (in megawatthours), and customer count of electricity delivered to end-use customers by state, sector, and balancing authority. A state, service type, and balancing authority-level adjustment is made for non-respondents and for customer-sited respondents.
      • Historical Changes: In 2003, we created the transportation sector and removed the other sector. We made this change to separate the transportation sales and reassign the other activities to the commercial and industrial sectors as appropriate. Non-transportation customers previously reported under other, including street and highway lighting, are now included in the commercial sector. Previously, we referred to this file as retail sales.
    • Sales to Ultimate Customers, Customer-Sited
      • Time frame: 2002 to present
      • Description: The data contain revenue, sales (in megawatthours), and customer count of electricity delivered to end-use customers by state, sector, and balancing authority. These data aren’t collected on Form EIA-861; however, they are included in the state adjustments totals in the sales to ultimate customers file.
    • Service Territory
      • Surveys: Form EIA-861 and Form EIA-861S
      • Time frame: 2001 to present
      • Description: The data contain names of counties and states in which the utility has equipment to distribute electricity to ultimate customers.
    • Short Form
      • Surveys: Form EIA-861 and Form EIA-861S
      • Time frame: 2001 to present
      • Description: The data contain revenue, sales (in megawatthours), and customer count of electricity delivered to end-use customers, by state and balancing authority. Respondents answer whether they have net metering, demand side management, and time-based programs.
    • Utility Data
      • Survey: Form EIA-861
      • Time frame: 1990 to present
      • Description: The data contain information on a utility’s North American Electric Reliability (NERC) regions of operation. The data also indicate a utility’s independent system operator (ISO) or regional transmission organization (RTO) and whether that utility is engaged in any of the following activities: generation, transmission, buying transmission, distribution, buying distribution, wholesale marketing, retail marketing, bundled service, or operating alternative-fueled vehicles.
      • Historical Changes: In 2010, we added the independent system operator (ISO) and regional transmission organization (RTO) regions.
    • Demand-Side Management (DSM)
      • Survey: Form EIA-861
      • Time frame: 2001 to 2012
      • Description: The data contain energy efficiency incremental data, energy efficiency annual data, load management incremental data, load management annual data, annual costs, and the customer counts of price response and time response programs by sector.
      • Historical Changes: In 2007, we added the customer counts of price response and time response programs.
    • Green Pricing
      • Survey: Form EIA-861
      • Time frame: 2001 to 2012
      • Description: The data contain revenue, sales, and customer count by sector and state.
      • Historical Changes: Initially, data contained only the customer count. In 2007, revenue and sales were added.
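    The DC-to-AC adjustment described in the net-metering and non-net-metering distributed schedules is a single multiplication by the stated 0.8256 factor. A minimal sketch (the function name is ours, not EIA's):

```python
DC_TO_AC_FACTOR = 0.8256  # conversion factor stated in the schedule descriptions

def dc_to_ac(capacity_dc_mw: float) -> float:
    """Convert PV capacity reported in DC megawatts to AC megawatts,
    as applied for respondents who report capacity in DC units."""
    return capacity_dc_mw * DC_TO_AC_FACTOR

# 100 MW reported in DC units corresponds to 82.56 MW AC.
print(dc_to_ac(100.0))  # 82.56
```

    As the schedule notes, this imputation is established only for PV; other energy sources reported in DC units are left unadjusted.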

    MIL OSI USA News –

    January 23, 2025
  • MIL-OSI: Innventure Sponsors Licensing Executives Society (LES) Annual Meeting 2024

    Source: GlobeNewswire (MIL-OSI)

    ORLANDO, Fla., Oct. 10, 2024 (GLOBE NEWSWIRE) — Innventure (Nasdaq: INV), a technology commercialization platform, today announced its sponsorship of the Licensing Executives Society (USA & Canada), Inc. (LES) 2024 Annual Meeting. Innventure’s Gold Level sponsorship and attendance at the October 20-23 event in New Orleans underscores its commitment to fostering innovation and bringing groundbreaking technologies to market.

    “The LES Annual Meeting is like no other IP event you’ll go to,” said Bob Held, IP & Licensing Expert, Past President & Chair of the Board and part-time CEO of LES. “It brings together people from all walks of the IP life – students, university professors, CEOs of corporations, startups, mid-size companies, university tech transfer offices, government officials, attorneys, and consultants. It’s a forum where relationships are built that can last for decades, benefiting attendees both personally and professionally.”

    The LES Annual Meeting is a cornerstone event for professionals in intellectual property, licensing, and technology transfer. This year’s meeting is expected to draw over 500 attendees from around the world.

    The four-day event will feature over 30 panel sessions, 15 roundtable discussions, and distinguished keynote speakers, including Alaina van Horn, Chief of the Intellectual Property Enforcement (IPE) Branch of U.S. Customs and Border Protection, and Congressman Troy Carter. Topics will range from artificial intelligence and life sciences to data use in complex SEP licensing and recent legal updates across the U.S. and Europe.

    “We see tremendous value in supporting LES and its mission,” said Bill Haskell, CEO of Innventure. “Our model of commercializing breakthrough technologies aligns perfectly with the LES community’s focus on advancing the business of intellectual property.”

    At the LES Annual Meeting, Innventure will lead a workshop titled “Maximizing IP Value through Strategic Spin-Outs and Alternative Commercialization Approaches” that is scheduled for October 22, from 11:15 a.m. to 12:15 p.m. in Galerie 4 (2nd Floor). This panel will feature Innventure executives Gayle Anderson and Tom Cripe, alongside David Rikkers of Expedited Climb Capital LLC. This interactive session, conducted in a talk show format with Q&A, is designed for seasoned IP executives and professionals seeking to understand the nuances of technology transfer and spin-outs.

    The LES Annual Meeting provides unparalleled networking opportunities and insights into the evolving landscape of IP and technology commercialization.

    Representatives from Innventure will be on-site at the New Orleans Marriott, and available for meetings. Please visit them at the Innventure booth or reach out to Erin Steigerwalt, Innventure events manager.

    “In today’s rapidly changing IP environment, staying informed and connected is crucial,” Held said. “Whether it’s understanding the impact of generative AI on patents or keeping up with judicial rulings, LES offers the knowledge and connections needed to navigate these challenges effectively.”

    For more information about the LES Annual Meeting 2024 and to register, visit https://les2024.org/ or Innventure.com.

    About Innventure
    Innventure founds, funds, and operates companies with a focus on transformative, sustainable technology solutions acquired or licensed from multinational corporations. As owner-operators, Innventure takes what it believes to be breakthrough technologies from early evaluation to scaled commercialization utilizing an approach designed to help mitigate risk as it builds disruptive companies it believes have the potential to achieve a target enterprise value of at least $1 billion. Innventure defines “disruptive” as innovations that have the ability to significantly change the way businesses, industries, markets and/or consumers operate.

    About LES
    Established in 1965, the Licensing Executives Society (U.S.A. and Canada), Inc. (LES) is the largest member society of the Licensing Executives Society International, Inc. (LESI). LES has over 1,600 members and LESI has over 6,500 members engaged in the creation, commercial development, and orderly transfer of intellectual property rights; protection and management of intellectual capital; and intellectual capital management standards development.

    Events Manager Contact: Erin Steigerwalt, Innventure
    esteigerwalt@innventure.com

    Media Contact: Laurie Steinberg, Solebury Strategic Communications
    press@innventure.com

    Investor Relations Contact: Sloan Bohlen, Solebury Strategic Communications
    investorrelations@innventure.com 

    The MIL Network –

    January 23, 2025
  • MIL-OSI: Varonis Announces Date of Third Quarter 2024 Financial Results

    Source: GlobeNewswire (MIL-OSI)

    NEW YORK, Oct. 10, 2024 (GLOBE NEWSWIRE) — Varonis Systems, Inc. (Nasdaq: VRNS), a leader in data security, announced that it will report its third quarter 2024 financial results following the close of the U.S. financial markets Tuesday, October 29, 2024.

    In conjunction with this announcement, Varonis will host a conference call Tuesday, October 29, 2024, at 4:30 p.m. ET to discuss the company’s financial results.

    To access this call, dial 877-425-9470 (domestic) or 201-389-0878 (international). The conference ID number is 13749435. A replay of this conference call will be available through November 5, 2024, at 844-512-2921 (domestic) or 412-317-6671 (international). The replay passcode is 13749435.

    A live webcast of this conference call will be available on the “Investor Relations” page of the company’s website (https://ir.varonis.com), and the replay will be archived on the website for one year.

    Additional Resources

    About Varonis

    Varonis (Nasdaq: VRNS) is a leader in data security, fighting a different battle than conventional cybersecurity companies. Our cloud-native Data Security Platform continuously discovers and classifies critical data, removes exposures, and detects advanced threats with AI-powered automation.

    Thousands of organizations worldwide trust Varonis to defend their data wherever it lives — across SaaS, IaaS, and hybrid cloud environments. Customers use Varonis to automate a wide range of security outcomes, including data security posture management (DSPM), data classification, data access governance (DAG), data detection and response (DDR), data loss prevention (DLP), and insider risk management.

    Varonis protects data first, not last. Learn more at http://www.varonis.com.

    Investor Relations Contact:
    Tim Perz
    Varonis Systems, Inc.
    646-640-2112
    investors@varonis.com

    News Media Contact:
    Rachel Hunt
    Varonis Systems, Inc.
    877-292-8767 (ext. 1598)
    pr@varonis.com

    January 23, 2025
  • MIL-OSI: SuRo Capital Corp. Third Quarter 2024 Preliminary Investment Portfolio Update

    Source: GlobeNewswire (MIL-OSI)

    Continues to Execute on AI Strategy with Significant New Investments

    Net Asset Value Anticipated to be $6.50 to $7.00 Per Share

    NEW YORK, Oct. 10, 2024 (GLOBE NEWSWIRE) — SuRo Capital Corp. (“SuRo Capital”, the “Company”, “we”, “us”, and “our”) (Nasdaq: SSSS) today provided the following preliminary update on its investment portfolio for the third quarter ended September 30, 2024.

    “For over a decade, SuRo Capital has been the public’s gateway to curated venture capital. This access, once reserved only for venture capitalists, has provided exposure to some of the largest, most compelling, and highly sought after private companies in the world before they become publicly traded. Our current portfolio offers exposure to the infrastructure for artificial intelligence, growing consumer brands, and exciting consumer and enterprise software names, among others,” said Mark Klein, Chairman and Chief Executive Officer of SuRo Capital.

    Mr. Klein continued, “This year has been one of the most active investment periods for SuRo Capital in the last decade. During the quarter, we made a $17.5 million investment in OpenAI (via ARK Type One Deep Ventures Fund LLC), one of the largest artificial intelligence developers in the world, and increased our position in CoreWeave, an AI cloud computing provider, via a $5.0 million secondary transaction. Subsequent to quarter-end, we made a $12.0 million investment in VAST Data (via IH10, LLC), an AI infrastructure data platform focused on providing enhanced productivity and simple data management for the AI-powered world, and increased our investment in CoreWeave with an additional $5.0 million secondary.”

    “With these new investments and our existing investment in CW Opportunity 2 LP we have invested nearly $55.0 million into some of the leading AI infrastructure companies. Given AI’s significant addressable market, we believe dedicating a significant portion of our portfolio to AI infrastructure will prove to be successful for our shareholders,” Mr. Klein continued.

    “Finally, during the quarter, our Board of Directors approved a repurchase program of up to $35.0 million for our 6.00% Notes due 2026 and the issuance of up to $75.0 million of private 6.50% Convertible Notes due 2029, with an initial issuance of up to $25.0 million. We believe the refinancing of a portion of our current debt to a longer-dated convertible instrument with favorable terms strengthens our balance sheet, provides greater flexibility to invest capital beyond 2026, and will ultimately maximize shareholder value in the long term,” concluded Mr. Klein.

    As previously reported, SuRo Capital’s net assets totaled approximately $162.3 million, or $6.94 per share, at June 30, 2024, and approximately $212.0 million, or $8.41 per share, at September 30, 2023. As of September 30, 2024, SuRo Capital’s net asset value is estimated to be between $6.50 and $7.00 per share, based on presently available information.

    Investment Portfolio Update
    As of September 30, 2024, SuRo Capital held positions in 36 portfolio companies – 32 privately held and 4 publicly held, some of which may be subject to certain lock-up provisions.

    During the three months ended September 30, 2024, SuRo Capital made the following investments:

    Portfolio Company | Investment | Transaction Date | Amount(1)
    OpenAI Global, LLC – ARK Type One Deep Ventures Fund LLC(2) | Convertible Equity via Class A Interest | 9/25/2024 | $17.5 million
    CoreWeave, Inc. | Common Shares | 9/26/2024 | $5.0 million

    __________________
    (1)   Amount invested does not include any capitalized costs or prepaid management fees or fund expenses, if applicable.
    (2)   SuRo Capital is invested in the Convertible Equity of OpenAI Global, LLC through its investment in the Class A Interest of ARK Type One Deep Ventures Fund LLC. ARK Type One Deep Ventures Fund LLC’s sole portfolio asset for Class A Interest holders is the Convertible Equity of OpenAI Global, LLC.

    During the three months ended September 30, 2024, SuRo Capital exited or received proceeds from the following investments:

    Portfolio Company | Transaction Date | Quantity | Average Net Share Price(1) | Net Proceeds | Realized Gain/(Loss)
    Churchill Sponsor VII LLC | 8/18/2024 | N/A | N/A | $- | $(0.3 million)
    OneValley, Inc. (f/k/a NestGSV, Inc.)(2) | 8/29/2024 | N/A | N/A | $3.0 million | $(6.6 million)
    PSQ Holdings, Inc. (d/b/a PublicSquare) – Public Common Shares(3) | Various | 359,845 | $2.82 | $1.0 million | $0.7 million
    SPBRX, INC. (f/k/a GSV Sustainability Partners, Inc.)(4) | 9/30/2024 | N/A | N/A | $0.4 million | $(6.8 million)
    YouBet Technology, Inc. (d/b/a FanPower)(5) | 8/22/2024 | N/A | N/A | $- | $(0.8 million)

    __________________
    (1)   The average net share price is the net share price realized after deducting all commissions and fees on the sale(s), if applicable.
    (2)   On August 29, 2024, SuRo Capital sold its remaining position in OneValley, Inc. (f/k/a NestGSV, Inc.).
    (3)   As of September 30, 2024, SuRo Capital held 1,616,187 remaining PSQ Holdings, Inc. (d/b/a PublicSquare) public common shares.
    (4)   On September 20, 2024, SPBRX, INC. (f/k/a GSV Sustainability Partners, Inc.) dissolved its business and made a final distribution.
    (5)   Investment made through SuRo Capital Sports, LLC.

    Subsequent to quarter-end through October 10, 2024, SuRo Capital made the following investments:

    Portfolio Company | Investment | Transaction Date | Amount(1)
    CoreWeave, Inc. | Series A Preferred | 10/8/2024 | $5.0 million
    VAST Data, Ltd. – IH10, LLC(2) | Series B Preferred via Membership Interest | 10/9/2024 | $12.0 million

    __________________
    (1)   Amount invested does not include any capitalized costs or prepaid management fees or fund expenses, if applicable.
    (2)   SuRo Capital is invested in the Series B Preferred Shares of VAST Data, Ltd. through its investment in the Membership Interest of IH10, LLC. IH10, LLC’s sole portfolio asset is interest in the Series B Preferred Shares of VAST Data, Ltd. through a special purpose vehicle.

    SuRo Capital’s liquid assets were approximately $39.5 million as of September 30, 2024, consisting of cash and securities of publicly traded portfolio companies not subject to lock-up restrictions at quarter-end.

    As of September 30, 2024, there were 23,378,002 shares of the Company’s common stock outstanding.
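    As a rough cross-check, the estimated per-share NAV range and the share count above can be combined to bound total net assets. This is back-of-envelope arithmetic on figures stated in the release, not a figure SuRo Capital has reported:

```python
# Hedged sketch: implied total net assets from the stated share count
# and the preliminary $6.50-$7.00 per-share NAV estimate.
shares_outstanding = 23_378_002       # common shares as of September 30, 2024
nav_low, nav_high = 6.50, 7.00        # estimated NAV per share ($)

implied_low = shares_outstanding * nav_low / 1e6
implied_high = shares_outstanding * nav_high / 1e6
print(f"implied net assets: ${implied_low:.1f}M to ${implied_high:.1f}M")
# roughly $152.0M to $163.6M
```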

    Convertible Note Purchase Agreement
    On August 6, 2024, SuRo Capital entered into a Note Purchase Agreement (the “Note Purchase Agreement”), by and between the Company and the purchaser identified therein (the “Purchaser”), pursuant to which we may issue up to a maximum of $75.0 million in aggregate principal amount of 6.50% Convertible Notes due 2029 (the “Convertible Notes”). Pursuant to the Note Purchase Agreement, on August 14, 2024 we issued and sold, and the Purchaser purchased, $25.0 million in aggregate principal amount of the Convertible Notes (the “Initial Notes”). Under the Note Purchase Agreement, upon mutual agreement between the Company and the Purchaser, we may issue additional Convertible Notes for sale in subsequent offerings to the Purchaser (the “Additional Notes”), or issue additional notes with modified pricing terms (the “New Notes”), in the aggregate for both the Additional Notes and the New Notes, up to a maximum of $50.0 million in one or more private offerings.

    Interest on the Convertible Notes will be paid quarterly in arrears on March 30, June 30, September 30, and December 30, at a rate of 6.50% per year, beginning September 30, 2024. The Convertible Notes will mature on August 14, 2029, and may be redeemed in whole or in part at any time or from time to time at our option on or after August 6, 2027 upon the fulfillment of certain conditions. The Convertible Notes will be convertible into shares of our common stock at the Purchaser’s sole discretion at an initial conversion rate of 129.0323 shares of our common stock per $1,000 principal amount of the Convertible Notes, subject to adjustments and limitations as provided in the Note Purchase Agreement. The net proceeds from the offering of the Convertible Notes will be used to repay outstanding indebtedness, make investments in accordance with our investment objective and investment strategy, and for other general corporate purposes. The Note Purchase Agreement includes customary representations, warranties, and covenants by the Company.

    Subsequent to quarter-end, pursuant to the Note Purchase Agreement, on October 9, 2024 we issued and sold, and the Purchaser purchased, $5.0 million in aggregate principal amount of the Additional Notes. The Additional Notes are treated as a single series with the Initial Notes and have the same terms as the Initial Notes. The Additional Notes are fungible and rank equally with the Initial Notes. Upon issuance of the Additional Notes, the outstanding aggregate principal amount of our Convertible Notes became $30.0 million.
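    The stated conversion rate and outstanding principal imply a few derived figures that the release does not spell out. A minimal sketch, using only the terms quoted above (6.50% coupon, 129.0323 shares per $1,000 principal, $30.0 million outstanding); the derived numbers are illustrative arithmetic, not company disclosures:

```python
# Hedged back-of-envelope math on the Convertible Notes terms.
principal_outstanding = 30_000_000    # $30.0M after the Additional Notes issuance
conversion_rate = 129.0323            # shares per $1,000 principal
coupon = 0.065                        # 6.50% annual interest

implied_conversion_price = 1_000 / conversion_rate
shares_if_fully_converted = principal_outstanding / 1_000 * conversion_rate
annual_interest = principal_outstanding * coupon

print(f"implied conversion price:  ${implied_conversion_price:.2f}")    # ~$7.75/share
print(f"shares on full conversion: {shares_if_fully_converted:,.0f}")   # ~3.87M shares
print(f"annual interest at 6.50%:  ${annual_interest:,.0f}")            # $1,950,000
```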

    Note Repurchase Program
    On August 6, 2024, SuRo Capital’s Board of Directors approved a discretionary note repurchase program (the “Note Repurchase Program”) which allows the Company to repurchase up to 46.67%, or $35.0 million in aggregate principal amount, of our 6.00% Notes due 2026 (the “6.00% Notes”) through open market purchases, including block purchases, in such manner as will comply with the provisions of the Investment Company Act of 1940, as amended (the “1940 Act”) and the Securities Exchange Act of 1934, as amended (the “Exchange Act”). As of September 30, 2024, we had repurchased 1,010,136 of the 6.00% Notes due 2026 under the Note Repurchase Program.

    Subsequent to quarter-end through October 10, 2024, we repurchased an additional 201,446 of the 6.00% Notes due 2026 under the Note Repurchase Program. The aggregate principal dollar amount of 6.00% Notes that may yet be repurchased by SuRo Capital under the Note Repurchase Program is approximately $4.7 million.

    Share Repurchase Program
    Under the Share Repurchase Program, the Company may repurchase its outstanding common stock in the open market, provided it complies with the prohibitions under its insider trading policies and procedures and the applicable provisions of the 1940 Act and the Exchange Act.

    Since inception of the Share Repurchase Program in August 2017, SuRo Capital has repurchased over 6.0 million shares of its common stock for an aggregate purchase price of approximately $39.3 million. This does not include repurchases under various tender offers during this time period. The dollar value of shares that may yet be purchased by SuRo Capital under the Share Repurchase Program is approximately $20.7 million. The Share Repurchase Program is authorized through October 31, 2024.

    Preliminary Estimates and Guidance
    The preliminary financial estimates provided herein are unaudited and have been prepared by, and are the responsibility of, the management of SuRo Capital. Neither our independent registered public accounting firm, nor any other independent accountants, have audited, reviewed, compiled, or performed any procedures with respect to the preliminary financial data included herein. Actual results may differ materially.

    The Company expects to announce its third quarter ended September 30, 2024 results in November 2024.

    Forward-Looking Statements
    Statements included herein, including statements regarding SuRo Capital’s beliefs, expectations, intentions, or strategies for the future, may constitute “forward-looking statements”. SuRo Capital cautions you that forward-looking statements are not guarantees of future performance and that actual results or developments may differ materially from those projected or implied in these statements. All forward-looking statements involve a number of risks and uncertainties, including the impact of any market volatility that may be detrimental to our business, our portfolio companies, our industry, and the global economy, that could cause actual results to differ materially from the plans, intentions, and expectations reflected in or suggested by the forward-looking statements. Risk factors, cautionary statements, and other conditions which could cause SuRo Capital’s actual results to differ from management’s current expectations are contained in SuRo Capital’s filings with the Securities and Exchange Commission. SuRo Capital undertakes no obligation to update any forward-looking statement to reflect events or circumstances that may arise after the date of this press release.

    About SuRo Capital Corp.
    SuRo Capital Corp. (Nasdaq: SSSS) is a publicly traded investment fund that seeks to invest in high-growth, venture-backed private companies. The fund seeks to create a portfolio of high-growth emerging private companies via a repeatable and disciplined investment approach, as well as to provide investors with access to such companies through its publicly traded common stock. SuRo Capital is headquartered in New York, NY and has offices in San Francisco, CA. Connect with the company on X, LinkedIn, and at http://www.surocap.com.

    Contact
    SuRo Capital Corp.
    (212) 931-6331
    IR@surocap.com

    January 23, 2025
  • MIL-OSI: SPS Commerce Announces Date of Third Quarter 2024 Financial Results

    Source: GlobeNewswire (MIL-OSI)

    MINNEAPOLIS, Oct. 10, 2024 (GLOBE NEWSWIRE) — SPS Commerce, Inc. (NASDAQ: SPSC), a leader in retail supply chain cloud services, today announced that it will issue its financial results for the third quarter ended September 30, 2024, after the market close on Thursday, October 24, 2024. SPS Commerce will host a call to discuss the results at 3:30 p.m. Central Time (4:30 p.m. Eastern Time) on the same day.

    To access the call, please dial 1-833-816-1382, or outside the U.S. 1-412-317-0475 at least 15 minutes prior to the 3:30 p.m. CT start time. Please ask to join the SPS Commerce Q3 2024 conference call. A live webcast of the call will also be available at http://investors.spscommerce.com under the Events and Presentations menu. The replay will also be available on our website at http://investors.spscommerce.com.

    About SPS Commerce

    SPS Commerce is the world’s leading retail network, connecting trading partners around the globe to optimize supply chain operations for all retail partners. We support data-driven partnerships with innovative cloud technology, customer-obsessed service and accessible experts so our customers can focus on what they do best. To date, more than 120,000 companies in retail, grocery, distribution, supply, and logistics have chosen SPS as their retail network. SPS has achieved 94 consecutive quarters of revenue growth and is headquartered in Minneapolis. For additional information, contact SPS at 866-245-8100 or visit http://www.spscommerce.com.

    SPS COMMERCE, SPS, SPS logo and INFINITE RETAIL POWER are marks of SPS Commerce, Inc. and registered in the U.S. Patent and Trademark Office, along with other SPS marks. Such marks may also be registered or otherwise protected in other countries. 

    Contact:

    Investor Relations
    The Blueshirt Group
    Irmina Blaszczyk
    Lisa Laukkanen
    SPSC@blueshirtgroup.com
    415-217-4962  

    SPS-F

    January 23, 2025
  • MIL-OSI: Check Point Software Recognized by Forbes for Fifth Consecutive Year as World’s Top Notch Cyber Security Employer

    Source: GlobeNewswire (MIL-OSI)

    REDWOOD CITY, Calif., Oct. 10, 2024 (GLOBE NEWSWIRE) — Check Point Software Technologies Ltd. (NASDAQ: CHKP) has been named as one of the World’s Best Employers by Forbes for the fifth year in a row. With over 6,500 employees around the world, Check Point is once again recognized as a leading cyber security employer by Forbes and was recently recognized as one of the World’s Best Companies by TIME.

    “Our employees are Check Point’s greatest asset in our mission to secure the world from cyber threats,” said Yiftah Yoffe, Chief HR Officer at Check Point Software. “We strive every day to create an inclusive and innovation-minded culture to support and encourage our employees. We are proud to be recognized for the fifth year by our employees and their peers for being the world’s top notch cyber security employer.”

    The ranking is the result of comprehensive research on employer quality conducted on a global scale in partnership with Statista. The analysis included a survey of more than 300,000 employees in over 50 countries who work for multinational corporations that employ more than 1,000 workers and operate in at least two of the six continental regions of the world, yielding millions of data points. Check Point ranked #43 in the prestigious IT, Internet, Software & Services category and #613 in the full list of 850 organizations.

    Check Point takes its Environmental, Social, and Governance (ESG) responsibility seriously. The company recently released its 2023 ESG report: Security through Sustainability and Action, including its progress in achieving carbon neutrality by 2040, training people in cyber security skills for a safer digital world, dedication to diversity and inclusion, and philanthropy efforts.

    Follow Check Point via:
    LinkedIn: https://www.linkedin.com/company/check-point-software-technologies
    Twitter: https://www.twitter.com/checkpointsw
    Facebook: https://www.facebook.com/checkpointsoftware
    Blog: https://blog.checkpoint.com
    YouTube: https://www.youtube.com/user/CPGlobal

    About Check Point Software Technologies Ltd. 
    Check Point Software Technologies Ltd. (http://www.checkpoint.com) is a leading AI-powered, cloud-delivered cyber security platform provider protecting over 100,000 organizations worldwide. Check Point leverages the power of AI everywhere to enhance cyber security efficiency and accuracy through its Infinity Platform, with industry-leading catch rates enabling proactive threat anticipation and smarter, faster response times. The comprehensive platform includes cloud-delivered technologies consisting of Check Point Harmony to secure the workspace, Check Point CloudGuard to secure the cloud, Check Point Quantum to secure the network, and Check Point Infinity Platform Services for collaborative security operations and services.

    MEDIA CONTACT:
    Liz Wu
    Check Point Software Technologies
    press@checkpoint.com

    INVESTOR CONTACT:
    Kip E. Meintzer
    Check Point Software Technologies
    ir@checkpoint.com

    January 23, 2025
  • MIL-OSI: AMD Delivers Leadership AI Performance with AMD Instinct MI325X Accelerators

    Source: GlobeNewswire (MIL-OSI)

    ─ Latest accelerators offer market leading HBM3E memory capacity and are supported by partners and customers including Dell Technologies, HPE, Lenovo, Supermicro and others ─

    ─ AMD Pensando Salina DPU offers 2X generational performance and AMD Pensando Pollara 400 is industry’s first UEC-ready NIC ─

    SAN FRANCISCO, Oct. 10, 2024 (GLOBE NEWSWIRE) — Today, AMD (NASDAQ: AMD) announced the latest accelerator and networking solutions that will power the next generation of AI infrastructure at scale: AMD Instinct™ MI325X accelerators, the AMD Pensando™ Pollara 400 NIC and the AMD Pensando Salina DPU. AMD Instinct MI325X accelerators set a new standard in performance for Gen AI models and data centers.

    Built on the AMD CDNA™ 3 architecture, AMD Instinct MI325X accelerators are designed for exceptional performance and efficiency for demanding AI tasks spanning foundation model training, fine-tuning and inferencing. Together, these products enable AMD customers and partners to create highly performant and optimized AI solutions at the system, rack and data center level.

    “AMD continues to deliver on our roadmap, offering customers the performance they need and the choice they want, to bring AI infrastructure, at scale, to market faster,” said Forrest Norrod, executive vice president and general manager, Data Center Solutions Business Group, AMD. “With the new AMD Instinct accelerators, EPYC processors and AMD Pensando networking engines, the continued growth of our open software ecosystem, and the ability to tie this all together into optimized AI infrastructure, AMD underscores the critical expertise to build and deploy world class AI solutions.”

    AMD Instinct MI325X Extends Leading AI Performance
    AMD Instinct MI325X accelerators deliver industry-leading memory capacity and bandwidth, with 256GB of HBM3E memory supporting 6.0TB/s, offering 1.8X more capacity and 1.3X more bandwidth than the H200¹. The AMD Instinct MI325X also offers 1.3X greater peak theoretical FP16 and FP8 compute performance compared to the H200¹.
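    The capacity and bandwidth multipliers can be sanity-checked against Nvidia's published H200 SXM figures (141 GB of HBM3e, 4.8 TB/s); note that those H200 numbers come from Nvidia's public specifications, not from this press release:

```python
# Hedged sanity check of the quoted ratios, assuming published H200 SXM specs.
mi325x_capacity_gb, mi325x_bw_tbs = 256, 6.0   # from the press release
h200_capacity_gb, h200_bw_tbs = 141, 4.8       # assumed Nvidia-published figures

print(f"capacity ratio:  {mi325x_capacity_gb / h200_capacity_gb:.2f}x")  # ~1.82x, quoted as 1.8X
print(f"bandwidth ratio: {mi325x_bw_tbs / h200_bw_tbs:.2f}x")            # 1.25x, quoted as 1.3X
```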

    This leadership memory and compute can provide up to 1.3X the inference performance on Mistral 7B at FP16², 1.2X the inference performance on Llama 3.1 70B at FP8³ and 1.4X the inference performance on Mixtral 8x7B at FP16⁴ compared to the H200.

    AMD Instinct MI325X accelerators are currently on track for production shipments in Q4 2024 and are expected to have widespread system availability from a broad set of platform providers, including Dell Technologies, Eviden, Gigabyte, Hewlett Packard Enterprise, Lenovo, Supermicro and others starting in Q1 2025.

    Continuing its commitment to an annual roadmap cadence, AMD previewed the next-generation AMD Instinct MI350 series accelerators. Based on AMD CDNA 4 architecture, AMD Instinct MI350 series accelerators are designed to deliver a 35x improvement in inference performance compared to AMD CDNA 3-based accelerators⁵.

    The AMD Instinct MI350 series will continue to drive memory capacity leadership with up to 288GB of HBM3E memory per accelerator. The AMD Instinct MI350 series accelerators are on track to be available during the second half of 2025.

    AMD Next-Gen AI Networking
    AMD is leveraging the most widely deployed programmable DPU for hyperscalers to power next-gen AI networking. AI networking splits into two parts: the front end, which delivers data and information to an AI cluster, and the back end, which manages data transfer between accelerators and clusters. It is critical to ensuring CPUs and accelerators are utilized efficiently in AI infrastructure.

    To effectively manage these two networks and drive high performance, scalability and efficiency across the entire system, AMD introduced the AMD Pensando™ Salina DPU for the front-end and the AMD Pensando™ Pollara 400, the industry’s first Ultra Ethernet Consortium (UEC) ready AI NIC, for the back-end.

    The AMD Pensando Salina DPU is the third generation of the world’s most performant and programmable DPU, bringing up to 2X the performance, bandwidth and scale compared to the previous generation. Supporting 400G throughput for fast data transfer rates, the AMD Pensando Salina DPU is a critical component in AI front-end network clusters, optimizing performance, efficiency, security and scalability for data-driven AI applications.

    The AMD Pensando Pollara 400, powered by the AMD P4 Programmable engine, is the industry’s first UEC-ready AI NIC. It supports next-gen RDMA software and is backed by an open networking ecosystem. The AMD Pensando Pollara 400 is critical for providing leadership performance, scalability and efficiency of accelerator-to-accelerator communication in back-end networks.

    Both the AMD Pensando Salina DPU and AMD Pensando Pollara 400 are sampling with customers in Q4’24 and are on track for availability in the first half of 2025.

    AMD AI Software Delivering New Capabilities for Generative AI
    AMD continues its investment in driving software capabilities and the open ecosystem to deliver powerful new features and capabilities in the AMD ROCm™ open software stack.

    Within the open software community, AMD is driving support for AMD compute engines in the most widely used AI frameworks, libraries and models including PyTorch, Triton, Hugging Face and many others. This work translates to out-of-the-box performance and support with AMD Instinct accelerators on popular generative AI models like Stable Diffusion 3, Meta Llama 3, 3.1 and 3.2 and more than one million models at Hugging Face.

    Beyond the community, AMD continues to advance its ROCm open software stack, bringing the latest features to support leading training and inference on Generative AI workloads. ROCm 6.2 now includes support for critical AI features like FP8 datatype, Flash Attention 3, Kernel Fusion and more. With these new additions, ROCm 6.2, compared to ROCm 6.0, provides up to a 2.4X performance improvement on inference⁶ and 1.8X on training for a variety of LLMs⁷.

    Supporting Resources

    • Follow AMD on LinkedIn
    • Follow AMD on Twitter
    • Read more about AMD Next Generation AI Networking here
    • Read more about AMD Instinct Accelerators here
    • Visit the AMD Advancing AI: 2024 event page

    About AMD
    For more than 50 years AMD has driven innovation in high-performance computing, graphics, and visualization technologies. Billions of people, leading Fortune 500 businesses, and cutting-edge scientific research institutions around the world rely on AMD technology daily to improve how they live, work, and play. AMD employees are focused on building leadership high-performance and adaptive products that push the boundaries of what is possible. For more information about how AMD is enabling today and inspiring tomorrow, visit the AMD (NASDAQ: AMD) website, blog, LinkedIn, and X pages.

    CAUTIONARY STATEMENT

    This press release contains forward-looking statements concerning Advanced Micro Devices, Inc. (AMD) such as the features, functionality, performance, availability, timing and expected benefits of AMD products including the AMD Instinct™ MI325X accelerators; AMD Pensando™ Salina DPU; AMD Pensando Pollara 400; continued growth of AMD’s open software ecosystem; AMD Instinct MI350 series accelerators, which are made pursuant to the Safe Harbor provisions of the Private Securities Litigation Reform Act of 1995. Forward-looking statements are commonly identified by words such as “would,” “may,” “expects,” “believes,” “plans,” “intends,” “projects” and other terms with similar meaning. Investors are cautioned that the forward-looking statements in this press release are based on current beliefs, assumptions and expectations, speak only as of the date of this press release and involve risks and uncertainties that could cause actual results to differ materially from current expectations. Such statements are subject to certain known and unknown risks and uncertainties, many of which are difficult to predict and generally beyond AMD’s control, that could cause actual results and other future events to differ materially from those expressed in, or implied or projected by, the forward-looking information and statements. 
Material factors that could cause actual results to differ materially from current expectations include, without limitation, the following: Intel Corporation’s dominance of the microprocessor market and its aggressive business practices; Nvidia’s dominance in the graphics processing unit market and its aggressive business practices; the cyclical nature of the semiconductor industry; market conditions of the industries in which AMD products are sold; loss of a significant customer; competitive markets in which AMD’s products are sold; economic and market uncertainty; quarterly and seasonal sales patterns; AMD’s ability to adequately protect its technology or other intellectual property; unfavorable currency exchange rate fluctuations; ability of third party manufacturers to manufacture AMD’s products on a timely basis in sufficient quantities and using competitive technologies; availability of essential equipment, materials, substrates or manufacturing processes; ability to achieve expected manufacturing yields for AMD’s products; AMD’s ability to introduce products on a timely basis with expected features and performance levels; AMD’s ability to generate revenue from its semi-custom SoC products; potential security vulnerabilities; potential security incidents including IT outages, data loss, data breaches and cyberattacks; uncertainties involving the ordering and shipment of AMD’s products; AMD’s reliance on third-party intellectual property to design and introduce new products; AMD’s reliance on third-party companies for design, manufacture and supply of motherboards, software, memory and other computer platform components; AMD’s reliance on Microsoft and other software vendors’ support to design and develop software to run on AMD’s products; AMD’s reliance on third-party distributors and add-in-board partners; impact of modification or interruption of AMD’s internal business processes and information systems; compatibility of AMD’s products with some or all 
industry-standard software and hardware; costs related to defective products; efficiency of AMD’s supply chain; AMD’s ability to rely on third party supply-chain logistics functions; AMD’s ability to effectively control sales of its products on the gray market; long-term impact of climate change on AMD’s business; impact of government actions and regulations such as export regulations, tariffs and trade protection measures; AMD’s ability to realize its deferred tax assets; potential tax liabilities; current and future claims and litigation; impact of environmental laws, conflict minerals related provisions and other laws or regulations; evolving expectations from governments, investors, customers and other stakeholders regarding corporate responsibility matters; issues related to the responsible use of AI; restrictions imposed by agreements governing AMD’s notes, the guarantees of Xilinx’s notes and the revolving credit agreement; impact of acquisitions, joint ventures and/or investments on AMD’s business and AMD’s ability to integrate acquired businesses;  impact of any impairment of the combined company’s assets; political, legal and economic risks and natural disasters; future impairments of technology license purchases; AMD’s ability to attract and retain qualified personnel; and AMD’s stock price volatility. Investors are urged to review in detail the risks and uncertainties in AMD’s Securities and Exchange Commission filings, including but not limited to AMD’s most recent reports on Forms 10-K and 10-Q.

    AMD, the AMD Arrow logo, AMD CDNA, AMD Instinct, Pensando, ROCm, and combinations thereof are trademarks of Advanced Micro Devices, Inc. Other names are for informational purposes only and may be trademarks of their respective owners.

    ________________________________

    1 MI325-002: Calculations conducted by AMD Performance Labs as of May 28, 2024 for the AMD Instinct™ MI325X GPU resulted in 1307.4 TFLOPS peak theoretical half precision (FP16), 1307.4 TFLOPS peak theoretical Bfloat16 format precision (BF16), 2614.9 TFLOPS peak theoretical 8-bit precision (FP8), 2614.9 TOPs peak theoretical INT8 floating-point performance. Actual performance will vary based on final specifications and system configuration.
    Published results on Nvidia H200 SXM (141GB) GPU: 989.4 TFLOPS peak theoretical half precision tensor (FP16 Tensor), 989.4 TFLOPS peak theoretical Bfloat16 tensor format precision (BF16 Tensor), 1,978.9 TFLOPS peak theoretical 8-bit precision (FP8), 1,978.9 TOPs peak theoretical INT8 floating-point performance. BFLOAT16 Tensor Core, FP16 Tensor Core, FP8 Tensor Core and INT8 Tensor Core performance were published by Nvidia using sparsity; for the purposes of comparison, AMD converted these numbers to non-sparsity/dense by dividing by 2, and these numbers appear above. 
    Nvidia H200 source:  https://nvdam.widen.net/s/nb5zzzsjdf/hpc-datasheet-sc23-h200-datasheet-3002446 and https://www.anandtech.com/show/21136/nvidia-at-sc23-h200-accelerator-with-hbm3e-and-jupiter-supercomputer-for-2024
    Note: Nvidia H200 GPUs have the same published FLOPs performance as H100 products https://resources.nvidia.com/en-us-tensor-core/.
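    The sparsity-to-dense conversion and the peak-throughput comparison in footnote 1 reduce to simple arithmetic. A quick sketch (this is not AMD code; every figure is copied from the footnote above):

```python
# Footnote-1 arithmetic: AMD compares MI325X dense peak figures against
# Nvidia H200 figures that Nvidia published with sparsity enabled, which
# AMD converts to dense by dividing by 2. Units: TFLOPS (FP16/BF16/FP8),
# TOPS (INT8). All values copied from the footnote.
mi325x     = {"FP16": 1307.4, "BF16": 1307.4, "FP8": 2614.9, "INT8": 2614.9}
h200_dense = {"FP16": 989.4,  "BF16": 989.4,  "FP8": 1978.9, "INT8": 1978.9}

# Recover the sparsity figures Nvidia originally published (dense x 2),
# then the peak ratio AMD's dense-vs-dense comparison implies.
h200_sparse = {fmt: 2 * v for fmt, v in h200_dense.items()}
ratios = {fmt: mi325x[fmt] / h200_dense[fmt] for fmt in mi325x}

for fmt, r in ratios.items():
    print(f"{fmt}: MI325X vs H200 (dense) peak ratio = {r:.2f}x")
```

    In every listed format the implied peak-throughput ratio works out to about 1.32x in favor of MI325X.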

    2 Based on testing completed on 9/28/2024 by AMD performance lab measuring overall latency for Mistral-7B model using FP16 datatype. Test was performed using input length of 128 tokens and an output length of 128 tokens for the following configurations of AMD Instinct™ MI325X GPU accelerator and NVIDIA H200 SXM GPU accelerator.

    1x MI325X at 1000W with vLLM performance: 0.637 sec (latency in seconds)
    Vs.
    1x H200 at 700W with TensorRT-LLM: 0.811 sec (latency in seconds)

    Configurations:
    AMD Instinct™ MI325X reference platform:
    1x AMD Ryzen™ 9 7950X 16-Core Processor CPU, 1x AMD Instinct MI325X (256GiB, 1000W) GPU, Ubuntu® 22.04, and ROCm™ 6.3 pre-release
    Vs
    NVIDIA H200 HGX platform:
    Supermicro SuperServer with 2x Intel Xeon® Platinum 8468 Processors, 8x Nvidia H200 (140GB, 700W) GPUs [only 1 GPU was used in this test], Ubuntu 22.04, CUDA 12.6. Server manufacturers may vary configurations, yielding different results. Performance may vary based on use of latest drivers and optimizations. MI325-005

    3 MI325-006: Based on testing completed on 9/28/2024 by AMD performance lab measuring overall latency for LLaMA 3.1-70B model using FP8 datatype. Test was performed using input length of 2048 tokens and an output length of 2048 tokens for the following configurations of AMD Instinct™ MI325X GPU accelerator and NVIDIA H200 SXM GPU accelerator.

    1x MI325X at 1000W with vLLM performance: 48.025 sec (latency in seconds)
    Vs.
    1x H200 at 700W with TensorRT-LLM: 62.688 sec (latency in seconds)

    Configurations:
    AMD Instinct™ MI325X reference platform:
    1x AMD Ryzen™ 9 7950X 16-Core Processor CPU, 1x AMD Instinct MI325X (256GiB, 1000W) GPU, Ubuntu® 22.04, and ROCm™ 6.3 pre-release
    Vs
    NVIDIA H200 HGX platform:
    Supermicro SuperServer with 2x Intel Xeon® Platinum 8468 Processors, 8x Nvidia H200 (140GB, 700W) GPUs, Ubuntu 22.04, CUDA 12.6

    Server manufacturers may vary configurations, yielding different results. Performance may vary based on use of latest drivers and optimizations.

    4 MI325-004: Based on testing completed on 9/28/2024 by AMD performance lab measuring text generated throughput for Mixtral-8x7B model using FP16 datatype. Test was performed using input length of 128 tokens and an output length of 4096 tokens for the following configurations of AMD Instinct™ MI325X GPU accelerator and NVIDIA H200 SXM GPU accelerator.

    1x MI325X at 1000W with vLLM performance: 4598 (Output tokens / sec)
    Vs.
    1x H200 at 700W with TensorRT-LLM: 2700.7 (Output tokens / sec)

    Configurations:
    AMD Instinct™ MI325X reference platform:
    1x AMD Ryzen™ 9 7950X CPU, 1x AMD Instinct MI325X (256GiB, 1000W) GPU, Ubuntu® 22.04, and ROCm™ 6.3 pre-release
    Vs
    NVIDIA H200 HGX platform:
    Supermicro SuperServer with 2x Intel Xeon® Platinum 8468 Processors, 8x Nvidia H200 (140GB, 700W) GPUs [only 1 GPU was used in this test], Ubuntu 22.04, CUDA® 12.6

    Server manufacturers may vary configurations, yielding different results. Performance may vary based on use of latest drivers and optimizations.
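    Footnotes 2-4 can be restated as speedup ratios with one convention: for latency, lower is better, so the ratio is competitor over AMD; for throughput, higher is better, so it is AMD over competitor. A sketch (not AMD code; all figures copied from the footnotes above):

```python
# Normalize the three MI325X-vs-H200 results from footnotes 2-4 into
# "MI325X advantage" multipliers.
def speedup(amd: float, h200: float, lower_is_better: bool) -> float:
    """Advantage ratio: >1.0 means MI325X wins under either metric."""
    return h200 / amd if lower_is_better else amd / h200

# (label, MI325X result, H200 result, lower-is-better?)
results = [
    ("Mistral-7B latency (s)",            0.637,  0.811,  True),
    ("Llama 3.1-70B latency (s)",         48.025, 62.688, True),
    ("Mixtral-8x7B throughput (tok/s)",   4598.0, 2700.7, False),
]

for label, amd, h200, lower in results:
    print(f"{label}: MI325X advantage {speedup(amd, h200, lower):.2f}x")
```

    The footnote figures imply roughly a 1.27x latency advantage on Mistral-7B, 1.31x on Llama 3.1-70B, and 1.70x higher Mixtral-8x7B throughput.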

    5 CDNA4-03: Inference performance projections as of May 31, 2024 using engineering estimates based on the design of a future AMD CDNA 4-based Instinct MI350 Series accelerator as proxy for projected AMD CDNA™ 4 performance. A 1.8T GPT MoE model was evaluated assuming a token-to-token latency = 70ms real time, first token latency = 5s, input sequence length = 8k, output sequence length = 256, assuming a 4x 8-mode MI350 series proxy (CDNA4) vs. 8x MI300X per GPU performance comparison. Actual performance will vary based on factors including but not limited to final specifications of production silicon, system configuration and inference model and size used.

    6 MI300-62: Testing conducted by internal AMD Performance Labs as of September 29, 2024 inference performance comparison between ROCm 6.2 software and ROCm 6.0 software on the systems with 8 AMD Instinct™ MI300X GPUs coupled with Llama 3.1-8B, Llama 3.1-70B, Mixtral-8x7B, Mixtral-8x22B, and Qwen 72B models.

    ROCm 6.2 with vLLM 0.5.5 performance was measured against the performance with ROCm 6.0 with vLLM 0.3.3, and tests were performed across batch sizes of 1 to 256 and sequence lengths of 128 to 2048.

    Configurations:
    1P AMD EPYC™ 9534 CPU server with 8x AMD Instinct™ MI300X (192GB, 750W) GPUs, Supermicro AS-8125GS-TNMR2, NPS1 (1 NUMA per socket), 1.5 TiB (24 DIMMs, 4800 MT/s memory, 64 GiB/DIMM), 4x 3.49TB Micron 7450 storage, BIOS version: 1.8, ROCm 6.2.0-00, vLLM 0.5.5, PyTorch 2.4.0, Ubuntu® 22.04 LTS with Linux kernel 5.15.0-119-generic.
    vs.
    1P AMD EPYC 9534 CPU server with 8x AMD Instinct™ MI300X (192GB, 750W) GPUs, Supermicro AS-8125GS-TNMR2, NPS1 (1 NUMA per socket), 1.5 TiB (24 DIMMs, 4800 MT/s memory, 64 GiB/DIMM), 4x 3.49TB Micron 7450 storage, BIOS version: 1.8, ROCm 6.0.0-00, vLLM 0.3.3, PyTorch 2.1.1, Ubuntu 22.04 LTS with Linux kernel 5.15.0-119-generic.

    Server manufacturers may vary configurations, yielding different results. Performance may vary based on factors including but not limited to different versions of configurations, vLLM, and drivers.

    7 MI300-61: Measurements conducted by AMD AI Product Management team on AMD Instinct™ MI300X GPU for comparing large language model (LLM) performance with optimization methodologies enabled and disabled as of 9/28/2024, on Llama 3.1-70B and Llama 3.1-405B with vLLM 0.5.5.

    System Configurations:
    – AMD EPYC 9654 96-Core Processor, 8 x AMD MI300X, ROCm™ 6.1, Linux® 7ee7e017abe3 5.15.0-116-generic #126-Ubuntu® SMP Mon Jul 1 10:14:24 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux, Frequency boost: enabled.

    Performance may vary on factors including but not limited to different versions of configurations, vLLM, and drivers.

    Contact:
    Aaron Grabein
     AMD Communications
    +1 737-256-9518
    aaron.grabein@amd.com

    Mitch Haws
    AMD Investor Relations
    +1 512-944-0790 
    mitch.haws@amd.com

    The MIL Network –

    January 23, 2025
  • MIL-OSI: AMD Launches New Ryzen™ AI PRO 300 Series Processors to Power Next Generation of Commercial PCs

    Source: GlobeNewswire (MIL-OSI)

    – New processors deliver unprecedented AI compute capabilities1 and multi-day battery life2, enabling incredible productivity for business users –

    – AMD continues to expand commercial portfolio; more than 100 Ryzen AI PRO PCs on-track to launch through 2025 –

    SAN FRANCISCO, Oct. 10, 2024 (GLOBE NEWSWIRE) — Today, AMD (NASDAQ: AMD) announced its third generation commercial AI mobile processors, designed specifically to transform business productivity with Copilot+ features including live captioning and language translation in conference calls and advanced AI image generators. The new Ryzen AI PRO 300 Series processors deliver industry-leading AI compute3, with up to three times the AI performance than the previous generation4, and offer uncompromising performance for everyday workloads. Enabled with AMD PRO Technologies, the Ryzen AI PRO 300 Series processors offer world-class security and manageability features designed to streamline IT operations and ensure exceptional ROI for businesses.

    Ryzen AI PRO 300 Series processors feature new AMD “Zen 5” architecture, delivering outstanding CPU performance, and are the world’s best lineup of commercial processors for Copilot+ enterprise PCs5. Laptops equipped with Ryzen AI PRO 300 Series processors are designed to tackle businesses’ toughest workloads, with the top-of-stack Ryzen AI 9 HX PRO 375 offering up to 40% higher performance6 and up to 14% faster productivity performance7 compared to Intel’s Core Ultra 7 165U. With the addition of XDNA™ 2 architecture powering the integrated NPU, AMD Ryzen AI PRO 300 Series processors offer a cutting-edge 50+ NPU TOPS (Trillions of Operations Per Second) of AI processing power, exceeding Microsoft’s Copilot+ AI PC requirements8,9 and delivering exceptional AI compute and productivity capabilities for the modern business. Built on a 4nm process and with innovative power management, the new processors deliver extended battery life ideal for sustained performance and productivity on the go.

    “Enterprises are increasingly demanding more compute power and efficiency to drive their everyday tasks and most taxing workloads. We are excited to add the Ryzen AI PRO 300 Series, the most powerful AI processor built for business PCs10, to our portfolio of mobile processors,” said Jack Huynh, senior vice president and general manager, Computing and Graphics Group at AMD. “Our third generation AI-enabled processors for business PCs deliver unprecedented AI processing capabilities with incredible battery life and seamless compatibility for the applications users depend on.”

    AMD Ryzen AI PRO 300 Series Mobile Processors

    Model                        Cores/Threads  Boost11/Base Frequency  Total Cache  Graphics Model             cTDP    TOPS
    AMD Ryzen™ AI 9 HX PRO 375   12C/24T        Up to 5.1GHz / 2GHz     36MB         Radeon™ 890M Graphics      15-54W  Up to 55
    AMD Ryzen™ AI 9 HX PRO 370   12C/24T        Up to 5.1GHz / 2GHz     36MB         Radeon™ 890M Graphics      15-54W  Up to 50
    AMD Ryzen™ AI 7 PRO 360      8C/16T         Up to 5GHz / 2GHz       24MB         AMD Radeon™ 880M Graphics  15-54W  Up to 50


    AMD Continues to Expand Commercial OEM Ecosystem

    OEM partners continue to expand their commercial offerings with new PCs powered by Ryzen AI PRO 300 Series processors, delivering well-rounded performance and compatibility to their business customers. With industry leading TOPS, the next generation of Ryzen processor-powered commercial PCs are set to expand the possibilities of local AI processing with Microsoft Copilot+. OEM systems powered by Ryzen AI PRO 300 Series are expected to be on shelf starting later this year.

    “Microsoft’s partnership with AMD and the integration of Ryzen AI PRO processors into Copilot+ PCs demonstrate our joint focus on delivering impactful AI-driven experiences for our customers. The Ryzen AI PRO’s performance, combined with the latest features in Windows 11, enhances productivity, efficiency, and security,” said Pavan Davuluri, corporate vice president, Windows+ Devices, Microsoft. “Features like Improved Windows Search, Recall, and Click to Do make PCs more intuitive and responsive. Security enhancements, including the Microsoft Pluton security processor and Windows Hello Enhanced Sign-in Security, help safeguard customer data with advanced protection. We’re proud of our strong history of collaboration with AMD and are thrilled to bring these innovations to market.”

    “In today’s AI-powered era of computing, HP is dedicated to delivering powerful innovation and performance that revolutionizes the way people work,” said Alex Cho, president of Personal Systems, HP. “With the HP EliteBook X Next-Gen AI PC, we are empowering modern leaders to push boundaries without compromising power or performance. We are proud to expand our AI PC lineup powered by AMD, providing our commercial customers with a truly personalized experience.”

    “Lenovo’s partnership with AMD continues to drive AI PC innovation and deliver supreme performance for our business customers. Our recently announced ThinkPad T14s Gen 6 AMD, powered by the latest AMD Ryzen AI PRO 300 Series processors, showcases the strength of our collaboration,” said Luca Rossi, president, Lenovo Intelligent Devices Group. “This device offers outstanding AI computing power, enhanced security, and exceptional battery life, providing professionals with the tools they need to maximize productivity and efficiency. Together with AMD, we are transforming the business landscape by delivering smarter, AI-driven solutions that empower users to achieve more.”

    New PRO Technologies Features Build Upon Leadership Security and Management Features

    In addition to AMD Secure Processor12, AMD Shadow Stack and AMD Platform Secure Boot, AMD has expanded its PRO Technologies lineup with new security and manageability features. Processors equipped with PRO Technologies will now come standard with Cloud Bare Metal Recovery, allowing IT teams to seamlessly recover systems via the cloud, ensuring smooth and continuous operations; Supply Chain Security (AMD Device Identity), a new supply chain security function enabling traceability across the supply chain; and Watch Dog Timer, building on existing resiliency support with additional detection and recovery processes.

    Additional AI-based malware detection is available via PRO Technologies with select ISV partners. These new security features leverage the integrated NPU to run AI-based security workloads without impacting day-to-day performance.

    Supporting Resources

    About AMD
    For more than 50 years AMD has driven innovation in high-performance computing, graphics and visualization technologies. Billions of people, leading Fortune 500 businesses and cutting-edge scientific research institutions around the world rely on AMD technology daily to improve how they live, work and play. AMD employees are focused on building leadership high-performance and adaptive products that push the boundaries of what is possible. For more information about how AMD is enabling today and inspiring tomorrow, visit the AMD (NASDAQ: AMD) website, blog, LinkedIn and X pages.

    Cautionary Statement
    This press release contains forward-looking statements concerning Advanced Micro Devices, Inc. (AMD) such as the features, functionality, performance, availability, timing and expected benefits of AMD products including the AMD Ryzen™ AI PRO 300 Series mobile processors, which are made pursuant to the Safe Harbor provisions of the Private Securities Litigation Reform Act of 1995. Forward-looking statements are commonly identified by words such as “would,” “may,” “expects,” “believes,” “plans,” “intends,” “projects” and other terms with similar meaning. Investors are cautioned that the forward-looking statements in this press release are based on current beliefs, assumptions and expectations, speak only as of the date of this press release and involve risks and uncertainties that could cause actual results to differ materially from current expectations. Such statements are subject to certain known and unknown risks and uncertainties, many of which are difficult to predict and generally beyond AMD’s control, that could cause actual results and other future events to differ materially from those expressed in, or implied or projected by, the forward-looking information and statements. 
Material factors that could cause actual results to differ materially from current expectations include, without limitation, the following: Intel Corporation’s dominance of the microprocessor market and its aggressive business practices; Nvidia’s dominance in the graphics processing unit market and its aggressive business practices; the cyclical nature of the semiconductor industry; market conditions of the industries in which AMD products are sold; loss of a significant customer; competitive markets in which AMD’s products are sold; economic and market uncertainty; quarterly and seasonal sales patterns; AMD’s ability to adequately protect its technology or other intellectual property; unfavorable currency exchange rate fluctuations; ability of third party manufacturers to manufacture AMD’s products on a timely basis in sufficient quantities and using competitive technologies; availability of essential equipment, materials, substrates or manufacturing processes; ability to achieve expected manufacturing yields for AMD’s products; AMD’s ability to introduce products on a timely basis with expected features and performance levels; AMD’s ability to generate revenue from its semi-custom SoC products; potential security vulnerabilities; potential security incidents including IT outages, data loss, data breaches and cyberattacks; uncertainties involving the ordering and shipment of AMD’s products; AMD’s reliance on third-party intellectual property to design and introduce new products; AMD’s reliance on third-party companies for design, manufacture and supply of motherboards, software, memory and other computer platform components; AMD’s reliance on Microsoft and other software vendors’ support to design and develop software to run on AMD’s products; AMD’s reliance on third-party distributors and add-in-board partners; impact of modification or interruption of AMD’s internal business processes and information systems; compatibility of AMD’s products with some or all 
industry-standard software and hardware; costs related to defective products; efficiency of AMD’s supply chain; AMD’s ability to rely on third party supply-chain logistics functions; AMD’s ability to effectively control sales of its products on the gray market; long-term impact of climate change on AMD’s business; impact of government actions and regulations such as export regulations, tariffs and trade protection measures; AMD’s ability to realize its deferred tax assets; potential tax liabilities; current and future claims and litigation; impact of environmental laws, conflict minerals related provisions and other laws or regulations; evolving expectations from governments, investors, customers and other stakeholders regarding corporate responsibility matters; issues related to the responsible use of AI; restrictions imposed by agreements governing AMD’s notes, the guarantees of Xilinx’s notes and the revolving credit agreement; impact of acquisitions, joint ventures and/or investments on AMD’s business and AMD’s ability to integrate acquired businesses; impact of any impairment of the combined company’s assets; political, legal and economic risks and natural disasters; future impairments of technology license purchases; AMD’s ability to attract and retain qualified personnel; and AMD’s stock price volatility. Investors are urged to review in detail the risks and uncertainties in AMD’s Securities and Exchange Commission filings, including but not limited to AMD’s most recent reports on Forms 10-K and 10-Q.

    © 2024 Advanced Micro Devices, Inc. All rights reserved. AMD, the AMD Arrow logo, Radeon, RDNA, Ryzen, XDNA and combinations thereof are trademarks of Advanced Micro Devices, Inc. Certain AMD technologies may require third-party enablement or activation. Supported features may vary by operating system. Please confirm with the system manufacturer for specific features. No technology or product can be completely secure.

    The information contained herein is for informational purposes only and is subject to change without notice. Timelines, roadmaps, and/or product release dates shown in this Press Release are plans only and subject to change.


    1 As of May 2023, AMD has the first available dedicated AI engine on an x86 Windows processor, where ‘dedicated AI engine’ is defined as an AI engine that has no function other than to process AI inference models and is part of the x86 processor die. For detailed information, please check: https://www.amd.com/en/technologies/xdna.html. PHX-3a.
    2 All battery life claims are approximate. Actual battery life will vary based on several factors, including, but not limited to: product configuration and usage, software, operating conditions, wireless functionality, power management settings, screen brightness and other factors. The maximum capacity of the battery will naturally decrease with time and use. AMD has not independently tested or verified the battery life claim. GD-168.
    3 Based on AMD product specifications and competitive products announced as of Oct 2024. AMD Ryzen™ AI PRO 300 Series processors’ NPU offers up to 55 peak TOPS. This is the most TOPS offered on any system found in enterprise today. AI PC is defined as a laptop PC with a processor that includes a neural processing unit (NPU). STXP-06.
    4 Based on TOPS specification of AMD Ryzen™ AI 300 Series processors with 50 TOPS compared to an AMD Ryzen 8040 Series processors with 16 TOPS as of June 2024. STX-01. 
    5 Based on product specifications and competitive products announced as of Oct 2024 and testing as of Sept 2024 by AMD performance labs using the following systems: HP EliteBook X G1a with AMD Ryzen AI 9 HX PRO 375 processor @23W, Radeon 880M graphics, 32GB of RAM, 512GB SSD, VBS=ON, Windows 11 PRO; Dell Latitude 7450 with Intel Core Ultra 7 165U processor @15W (vPro enabled), Intel Iris Xe Graphics, VBS=ON, 32GB RAM, 512GB NVMe SSD, Microsoft Windows 11 Professional; Dell Latitude 7450 with Intel Core Ultra 7 165H processor @28W (vPro enabled), Intel Iris Xe Graphics, VBS=ON, 16GB RAM, 512GB NVMe SSD, Microsoft Windows 11 Pro. All systems were tested in Best Performance Mode. AI PC is defined as a laptop PC with a processor that includes a neural processing unit (NPU). STXP-04.
    6 Testing as of Sept 2024 by AMD performance labs on an HP EliteBook X G1a (14in) (40W) with AMD Ryzen AI 9 HX PRO 375 processor, Radeon™ 890M graphics, 32GB of RAM, 512GB SSD, VBS=ON, Windows 11 Pro vs. a Dell Latitude 7450 with an Intel Core Ultra 7 165H processor (vPro enabled), Intel Arc Graphics, VBS=ON, 16GB RAM, 512GB NVMe SSD, Microsoft Windows 11 Pro in the application(s) (Best Performance Mode): Cinebench R24 nT. Laptop manufacturers may vary configurations, yielding different results. STXP-12.
    7 Testing as of Sept 2024 by AMD performance labs using the following systems: (1) HP EliteBook X G1a with AMD Ryzen AI 9 HX PRO 375 processor (@40W), Radeon™ 890M graphics, 32GB of RAM, 512GB SSD, VBS=ON, Windows 11 Pro; (2) Dell Latitude 7450 with Intel Core Ultra 7 165U processor (@15W) (vPro enabled), Intel Iris Xe Graphics, VBS=ON, 32GB RAM, 512GB NVMe SSD, Microsoft Windows 11 Professional; and (3) Dell Latitude 7450 with Intel Core Ultra 7 165H processor (@28W) (vPro enabled), Intel Integrated, VBS=ON, 16GB RAM, 512GB NVMe SSD, Microsoft Windows 11 Pro. Tested applications (in Balanced Mode) include: Procyon Office Productivity, Procyon Office Productivity Excel, Procyon Office Productivity Outlook, Procyon Office Productivity Power Point, Procyon Office Productivity Word, Composite Geomean Score. Laptop manufacturers may vary configurations, yielding different results. STXP-18.
    8 Based on Microsoft Copilot+ requirements of minimum 40 TOPS using AMD product specifications and competitive products announced as of Oct 2024. Microsoft requirements found here – https://support.microsoft.com/en-us/topic/copilot-pc-hardware-requirements-35782169-6eab-4d63-a5c5-c498c3037364. STXP-05.
    9 Trillions of Operations per Second (TOPS) for an AMD Ryzen processor is the maximum number of operations per second that can be executed in an optimal scenario and may not be typical. TOPS may vary based on several factors, including the specific system configuration, AI model, and software version. GD-243.
    10 Testing as of Sept 2024 by AMD performance labs using the following benchmarks: Blender, Cinebench R24, Geekbench 6.3, and Passmark 11, systems: HP EliteBook X G1a with AMD Ryzen AI 9 HX PRO 375 processor @54W, Radeon 880M graphics, 32GB of RAM, 512GB SSD; Lenovo ThinkPad T14s Gen 6 with AMD Ryzen™ AI 7 PRO 360 processor @22W, Radeon™ 880M graphics, 32GB RAM, 1TB SSD; Dell Latitude 7450 with Intel Core Ultra 7 165U processor @15W (vPro enabled), Intel Iris Xe Graphics, 32GB RAM, 512GB NVMe SSD; Dell Latitude 7450 with Intel Core Ultra 7 165H processor @28W (vPro enabled), Intel Iris Xe Graphics, 16GB RAM, 512GB NVMe SSD. All systems ran Windows 11 Pro with VBS=ON and were tested in Best Performance Mode. PassMark is a registered trademark of PassMark Software Pty Ltd. AI PC is defined as a laptop PC with a processor that includes a neural processing unit (NPU). STXP-07.
    11 Boost Clock Frequency is the maximum frequency achievable on the CPU running a bursty workload. Boost clock achievability, frequency, and sustainability will vary based on several factors, including but not limited to: thermal conditions and variation in applications and workloads. GD-150.
    12 The AMD Secure Processor is a dedicated on-chip security processor integrated within each system-on-a-chip (SoC) and ASIC (Application Specific Integrated Circuit) designed by AMD. It enables secure boot with root of trust anchored in hardware, initializes the SoC through a secure boot flow, and establishes an isolated Trusted Execution Environment. GD-72.
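    Footnote 9 stresses that TOPS is a peak, best-case figure that real workloads may not sustain. As a back-of-the-envelope illustration of what the relationship looks like (the per-inference operation count and utilization below are hypothetical examples, not AMD data):

```python
# Illustrative only: relating a peak TOPS figure to an achievable
# inference rate. Peak TOPS comes from the processor spec; the model
# cost and sustained utilization are hypothetical placeholders.
PEAK_TOPS = 55  # peak NPU throughput, trillions of operations per second

def inferences_per_second(ops_per_inference: float, utilization: float) -> float:
    """Inference rate at a given fraction of peak throughput."""
    achieved_ops_per_sec = PEAK_TOPS * 1e12 * utilization
    return achieved_ops_per_sec / ops_per_inference

# e.g. a hypothetical model costing 10 billion ops per inference,
# sustained at 40% of peak
rate = inferences_per_second(10e9, 0.40)
print(f"{rate:.0f} inferences/sec")  # 2200 inferences/sec
```

    Under those assumed numbers, 55 peak TOPS at 40% sustained utilization works out to 2,200 inferences per second; the point of footnote 9 is that the utilization term varies widely with system configuration, AI model, and software version.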

    Contact:
    Stacy MacDiarmid
    AMD Communications
    +1 512-658-2265
    Stacy.MacDiarmid@amd.com

    Mitch Haws
    AMD Investor Relations
    +1 512-944-0790
    Mitch.Haws@amd.com

    A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/c67477ae-0d96-4936-91ba-cd836bfa321e

    The MIL Network –

    January 23, 2025
  • MIL-OSI: AMD Unveils Leadership AI Solutions at Advancing AI 2024

    Source: GlobeNewswire (MIL-OSI)

    — AMD launches 5th Gen AMD EPYC processors, AMD Instinct MI325X accelerators, next-gen networking solutions and AMD Ryzen AI PRO processors powering enterprise AI at scale —

    — Dell, Google Cloud, HPE, Lenovo, Meta, Microsoft, Oracle Cloud Infrastructure, Supermicro and AI leaders Databricks, Essential AI, Fireworks AI, Luma AI and Reka AI joined AMD to showcase expanding AMD AI solutions for enterprises and end users —

    — Technical leaders from Cohere, Google DeepMind, Meta, Microsoft, OpenAI and more discussed how they are using AMD ROCm software to deploy models and applications on AMD Instinct accelerators —

    SAN FRANCISCO, Oct. 10, 2024 (GLOBE NEWSWIRE) — AMD (NASDAQ: AMD) today launched the latest high performance computing solutions defining the AI computing era, including 5th Gen AMD EPYC™ server CPUs, AMD Instinct™ MI325X accelerators, AMD Pensando™ Salina DPUs, AMD Pensando Pollara 400 NICs and AMD Ryzen™ AI PRO 300 series processors for enterprise AI PCs. AMD and its partners also showcased how they are deploying AMD AI solutions at scale, the continued ecosystem growth of AMD ROCm™ open source AI software, and a broad portfolio of new solutions based on AMD Instinct accelerators, EPYC CPUs and Ryzen PRO CPUs.

    “The data center and AI represent significant growth opportunities for AMD, and we are building strong momentum for our EPYC and AMD Instinct processors across a growing set of customers,” said AMD Chair and CEO Dr. Lisa Su. “With our new EPYC CPUs, AMD Instinct GPUs and Pensando DPUs we are delivering leadership compute to power our customers’ most important and demanding workloads. Looking ahead, we see the data center AI accelerator market growing to $500 billion by 2028. We are committed to delivering open innovation at scale through our expanded silicon, software, network and cluster-level solutions.”

    Defining the Data Center in the AI Era
    AMD announced a broad portfolio of data center solutions for AI, enterprise, cloud and mixed workloads:

    • New AMD EPYC 9005 Series processors deliver record-breaking performance1 to enable optimized compute solutions for diverse data center needs. Built on the latest “Zen 5” architecture, the lineup offers up to 192 cores and will be available in a wide range of platforms from leading OEMs and ODMs starting today.
    • AMD continues executing its annual cadence of AI accelerators with the launch of AMD Instinct MI325X, delivering leadership performance and memory capabilities for the most demanding AI workloads. AMD also shared new details on next-gen AMD Instinct MI350 series accelerators expected to launch in the second half of 2025, extending AMD Instinct leadership memory capacity and generative AI performance. AMD has made significant progress developing the AMD Instinct MI400 Series accelerators based on the AMD CDNA Next architecture, planned to be available in 2026.
    • AMD has continuously improved its AMD ROCm software stack, doubling AMD Instinct MI300X accelerator inferencing and training performance2 across a wide range of the most popular AI models. Today, over one million models run seamlessly out of the box on AMD Instinct, triple the number available when MI300X launched, with day-zero support for the most widely used models.
    • AMD also expanded its high performance networking portfolio to address evolving system networking requirements for AI infrastructure, maximizing CPU and GPU performance to deliver performance, scalability and efficiency across the entire system. The AMD Pensando Salina DPU delivers a high performance front-end network for AI systems, while the AMD Pensando Pollara 400, the first Ultra Ethernet Consortium ready NIC, reduces the complexity of performance tuning and helps improve time to production.

    AMD partners detailed how they leverage AMD data center solutions to drive leadership generative AI capabilities, deliver cloud infrastructure used by millions of people daily and power on-prem and hybrid data centers for leading enterprises:

    • Since launching in December 2023, AMD Instinct MI300X accelerators have been deployed at scale by leading cloud, OEM and ODM partners and are serving millions of users daily on popular AI models, including OpenAI’s ChatGPT, Meta Llama and over one million open source models on the Hugging Face platform.
    • Google highlighted how AMD EPYC processors power a wide range of instances for AI, high performance, general purpose and confidential computing, including their AI Hypercomputer, a supercomputing architecture designed to maximize AI ROI. Google also announced EPYC 9005 Series-based VMs will be available in early 2025.
    • Oracle Cloud Infrastructure shared how it leverages AMD EPYC CPUs, AMD Instinct accelerators and Pensando DPUs to deliver fast, energy efficient compute and networking infrastructure for customers like Uber, Red Bull Powertrains, PayPal and Fireworks AI. OCI announced the new E6 compute platform powered by EPYC 9005 processors.
    • Databricks highlighted how its models and workflows run seamlessly on AMD Instinct and ROCm and disclosed that their testing shows the large memory capacity and compute capabilities of AMD Instinct MI300X GPUs help deliver an over 50% increase in performance on Llama and Databricks proprietary models.
    • Microsoft CEO Satya Nadella highlighted Microsoft’s longstanding collaboration and co-innovation with AMD across its product offerings and infrastructure, with MI300X delivering strong performance on Microsoft Azure and GPT workloads. Nadella and Su also discussed the companies’ deep partnership on the AMD Instinct roadmap and how Microsoft is planning to leverage future generations of AMD Instinct accelerators including MI350 series and beyond to deliver leadership performance-per-dollar-per-watt for AI applications.
    • Meta detailed how AMD EPYC CPUs and AMD Instinct accelerators power its compute infrastructure across AI deployments and services, with MI300X serving all live traffic on Llama 405B. Meta is also partnering with AMD to optimize AI performance from silicon, systems, and networking to software and applications.
    • Leading OEMs Dell, HPE, Lenovo and Supermicro are expanding on their highly performant, energy efficient AMD EPYC processor-based lineups with new platforms designed to modernize data centers for the AI era.

    Expanding an Open AI Ecosystem
    AMD continues to invest in the open AI ecosystem and expand the AMD ROCm open source software stack with new features, tools, optimizations and support to help developers extract the ultimate performance from AMD Instinct accelerators and deliver out-of-the-box support for today’s leading AI models. Leaders from Essential AI, Fireworks AI, Luma AI and Reka AI discussed how they are optimizing models across AMD hardware and software.

    AMD also hosted a developer event joined by technical leaders from across the AI developer ecosystem, including Microsoft, OpenAI, Meta, Cohere, xAI and more. Luminary presentations, hosted by the inventors of popular AI programming languages, models and frameworks critical to the AI transformation taking place, such as Triton, TensorFlow, vLLM and Paged Attention, and FastChat, showed how developers are unlocking AI performance optimizations through vendor-agnostic programming languages, accelerating models on AMD Instinct accelerators, and highlighted the ease of porting to ROCm software and how the ecosystem is benefiting from an open-source approach.

    Enabling Enterprise Productivity with AI PCs
    AMD launched AMD Ryzen AI PRO 300 Series processors, powering the first Microsoft Copilot+ laptops enabled for the enterprise3. The Ryzen AI PRO 300 Series processor lineup extends AMD leadership in performance and battery life with the addition of enterprise-grade security and manageability features for business users.

    • The Ryzen AI PRO 300 Series processors, featuring the new AMD “Zen 5” and AMD XDNA™ 2 architectures, are the world’s most advanced commercial processors4, offering best in class performance for unmatched productivity5 and an industry leading 55 NPU TOPS6 of AI performance with the Ryzen AI 9 HX PRO 375 processor to process AI tasks locally on Ryzen AI PRO laptops.
    • Microsoft highlighted how Windows 11 Copilot+ and the Ryzen AI PRO 300 lineup are ready for next generation AI experiences, including new productivity and security features.
    • OEM partners including HP and Lenovo are expanding their commercial offerings with new PCs powered by Ryzen AI PRO 300 Series processors, with more than 100 platforms expected to come to market through 2025.

    Supporting Resources

    • Watch the AMD Advancing AI keynote and see the news here
    • Follow AMD on X
    • Connect with AMD on LinkedIn

    About AMD
    For more than 50 years AMD has driven innovation in high-performance computing, graphics, and visualization technologies. Billions of people, leading Fortune 500 businesses, and cutting-edge scientific research institutions around the world rely on AMD technology daily to improve how they live, work, and play. AMD employees are focused on building leadership high-performance and adaptive products that push the boundaries of what is possible. For more information about how AMD is enabling today and inspiring tomorrow, visit the AMD (NASDAQ: AMD) website, blog, LinkedIn, and X pages.

    Cautionary Statement
    This press release contains forward-looking statements concerning Advanced Micro Devices, Inc. (AMD) such as the features, functionality, performance, availability, timing and expected benefits of AMD products; AMD’s expected data center and AI growth opportunities; the ability of AMD to build momentum for AMD EPYC™ and AMD Instinct™ processors across its customers; the ability of AMD to deliver leadership compute power to its customers’ workloads; the anticipated growth of the data center AI accelerator market by 2028; and AMD’s commitment to delivering open innovation at scale, which are made pursuant to the Safe Harbor provisions of the Private Securities Litigation Reform Act of 1995. Forward-looking statements are commonly identified by words such as “would,” “may,” “expects,” “believes,” “plans,” “intends,” “projects” and other terms with similar meaning. Investors are cautioned that the forward-looking statements in this press release are based on current beliefs, assumptions and expectations, speak only as of the date of this press release and involve risks and uncertainties that could cause actual results to differ materially from current expectations. Such statements are subject to certain known and unknown risks and uncertainties, many of which are difficult to predict and generally beyond AMD’s control, that could cause actual results and other future events to differ materially from those expressed in, or implied or projected by, the forward-looking information and statements.
Material factors that could cause actual results to differ materially from current expectations include, without limitation, the following: Intel Corporation’s dominance of the microprocessor market and its aggressive business practices; Nvidia’s dominance in the graphics processing unit market and its aggressive business practices; the cyclical nature of the semiconductor industry; market conditions of the industries in which AMD products are sold; loss of a significant customer; competitive markets in which AMD’s products are sold; economic and market uncertainty; quarterly and seasonal sales patterns; AMD’s ability to adequately protect its technology or other intellectual property; unfavorable currency exchange rate fluctuations; ability of third party manufacturers to manufacture AMD’s products on a timely basis in sufficient quantities and using competitive technologies; availability of essential equipment, materials, substrates or manufacturing processes; ability to achieve expected manufacturing yields for AMD’s products; AMD’s ability to introduce products on a timely basis with expected features and performance levels; AMD’s ability to generate revenue from its semi-custom SoC products; potential security vulnerabilities; potential security incidents including IT outages, data loss, data breaches and cyberattacks; uncertainties involving the ordering and shipment of AMD’s products; AMD’s reliance on third-party intellectual property to design and introduce new products; AMD’s reliance on third-party companies for design, manufacture and supply of motherboards, software, memory and other computer platform components; AMD’s reliance on Microsoft and other software vendors’ support to design and develop software to run on AMD’s products; AMD’s reliance on third-party distributors and add-in-board partners; impact of modification or interruption of AMD’s internal business processes and information systems; compatibility of AMD’s products with some or all 
industry-standard software and hardware; costs related to defective products; efficiency of AMD’s supply chain; AMD’s ability to rely on third party supply-chain logistics functions; AMD’s ability to effectively control sales of its products on the gray market; long-term impact of climate change on AMD’s business; impact of government actions and regulations such as export regulations, tariffs and trade protection measures; AMD’s ability to realize its deferred tax assets; potential tax liabilities; current and future claims and litigation; impact of environmental laws, conflict minerals related provisions and other laws or regulations; evolving expectations from governments, investors, customers and other stakeholders regarding corporate responsibility matters; issues related to the responsible use of AI; restrictions imposed by agreements governing AMD’s notes, the guarantees of Xilinx’s notes and the revolving credit agreement; impact of acquisitions, joint ventures and/or investments on AMD’s business and AMD’s ability to integrate acquired businesses;  impact of any impairment of the combined company’s assets; political, legal and economic risks and natural disasters; future impairments of technology license purchases; AMD’s ability to attract and retain qualified personnel; and AMD’s stock price volatility. Investors are urged to review in detail the risks and uncertainties in AMD’s Securities and Exchange Commission filings, including but not limited to AMD’s most recent reports on Forms 10-K and 10-Q.

    AMD, the AMD Arrow logo, EPYC, AMD CDNA, AMD Instinct, Pensando, ROCm, Ryzen, and combinations thereof are trademarks of Advanced Micro Devices, Inc. Other names are for informational purposes only and may be trademarks of their respective owners.

    __________________________________

    1 EPYC-022F: For a complete list of world records see: http://amd.com/worldrecords.
    2 Testing conducted by internal AMD Performance Labs as of September 29, 2024: inference performance comparison between ROCm 6.2 software and ROCm 6.0 software on systems with 8 AMD Instinct™ MI300X GPUs running the Llama 3.1-8B, Llama 3.1-70B, Mixtral-8x7B, Mixtral-8x22B, and Qwen 72B models.
    ROCm 6.2 with vLLM 0.5.5 performance was measured against the performance with ROCm 6.0 with vLLM 0.3.3, and tests were performed across batch sizes of 1 to 256 and sequence lengths of 128 to 2048.
    Configurations:
    1P AMD EPYC™ 9534 CPU server with 8x AMD Instinct™ MI300X (192GB, 750W) GPUs, Supermicro AS-8125GS-TNMR2, NPS1 (1 NUMA per socket), 1.5 TiB (24 DIMMs, 4800 MT/s memory, 64 GiB/DIMM), 4x 3.49TB Micron 7450 storage, BIOS version: 1.8, ROCm 6.2.0-00, vLLM 0.5.5, PyTorch 2.4.0, Ubuntu® 22.04 LTS with Linux kernel 5.15.0-119-generic.
    vs.
    1P AMD EPYC 9534 CPU server with 8x AMD Instinct™ MI300X (192GB, 750W) GPUs, Supermicro AS-8125GS-TNMR2, NPS1 (1 NUMA per socket), 1.5 TiB (24 DIMMs, 4800 MT/s memory, 64 GiB/DIMM), 4x 3.49TB Micron 7450 storage, BIOS version: 1.8, ROCm 6.0.0-00, vLLM 0.3.3, PyTorch 2.1.1, Ubuntu 22.04 LTS with Linux kernel 5.15.0-119-generic. MI300-62
    Server manufacturers may vary configurations, yielding different results. Performance may vary based on factors including but not limited to different versions of configurations, vLLM, and drivers.
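    Footnote 2 describes sweeping batch sizes (1 to 256) and sequence lengths (128 to 2048) and comparing two software stacks on identical hardware. A minimal sketch of how such per-point speedups are typically aggregated; the throughput numbers below are invented placeholders, not AMD's measurements:

```python
from math import prod

# Hypothetical throughput (tokens/s) per (batch, seq_len) point; NOT real data.
baseline = {(1, 128): 480.0, (64, 1024): 9100.0, (256, 2048): 15400.0}
candidate = {(1, 128): 610.0, (64, 1024): 12900.0, (256, 2048): 20300.0}

# Per-point speedup of the new stack over the old one on the same hardware.
speedups = {k: candidate[k] / baseline[k] for k in baseline}

# Geometric mean is the usual way to average ratios across workload points.
geomean = prod(speedups.values()) ** (1 / len(speedups))
print(f"geomean speedup: {geomean:.2f}x")
```

    A geometric mean is preferred over an arithmetic mean here because it treats a 2x gain and a 0.5x regression as offsetting rather than averaging to a net gain.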
    3 Based on Microsoft Copilot+ requirements of minimum 40 TOPS using AMD product specifications and competitive products announced as of Oct 2024. Microsoft requirements found here – https://support.microsoft.com/en-us/topic/copilot-pc-hardware-requirements-35782169-6eab-4d63-a5c5-c498c3037364. STXP-05.
    4 Based on a small node size for an x86 platform and cutting-edge, interconnected technologies, as of September 2024. GD-203b
    5 Testing as of Sept 2024 by AMD performance labs using the following systems: HP EliteBook X G1a with AMD Ryzen AI 9 HX PRO 375 processor @40W, Radeon™ 890M graphics, 32GB of RAM, 512GB SSD, VBS=ON, Windows 11 Pro; Lenovo ThinkPad T14s Gen 6 with AMD Ryzen™ AI 7 PRO 360 processor @22W, Radeon™ 880M graphics, 32GB RAM, 1TB SSD, VBS=ON, Windows 11 Pro; Dell Latitude 7450 with Intel Core Ultra 7 165U processor @15W (vPro enabled), Intel Iris Xe Graphics, VBS=ON, 32GB RAM, 512GB NVMe SSD, Microsoft Windows 11 Professional; Dell Latitude 7450 with Intel Core Ultra 7 165H processor @28W (vPro enabled), Intel Iris Xe Graphics, VBS=ON, 16GB RAM, 512GB NVMe SSD, Microsoft Windows 11 Pro. The following applications were tested in Balanced Mode: Teams + Procyon Office Productivity, Teams + Procyon Office Productivity Excel, Teams + Procyon Office Productivity Outlook, Teams + Procyon Office Productivity Power Point, Teams + Procyon Office Productivity Word, Composite Geomean Score. Each Microsoft Teams call consists of 9 participants (3X3). Laptop manufacturers may vary configurations, yielding different results. STXP-10.
    Testing as of Sept 2024 by AMD performance labs using the following systems: (1) Lenovo ThinkPad T14s Gen 6 with an AMD Ryzen™ AI 7 PRO 360 processor (@22W), Radeon™ 880M graphics, 32GB RAM, 1TB SSD, VBS=ON, Windows 11 Pro; (2) Dell Latitude 7450 with Intel Core Ultra 7 165U processor (@15W) (vPro enabled), Intel Iris Xe Graphics, VBS=ON, 32GB RAM, 512GB NVMe SSD, Microsoft Windows 11 Professional; and (3) Dell Latitude 7450 with Intel Core Ultra 7 165H processor (@28W) (vPro enabled), Intel Arc Graphics, VBS=ON, 16GB RAM, 512GB NVMe SSD, Microsoft Windows 11 Pro. Tested applications (in Balanced Mode) include: Procyon Office Productivity, Procyon Office Productivity Excel, Procyon Office Productivity Outlook, Procyon Office Productivity Power Point, Procyon Office Productivity Word, Composite Geomean Score. Laptop manufacturers may vary configurations, yielding different results. STXP-11.
    6 Trillions of Operations per Second (TOPS) for an AMD Ryzen processor is the maximum number of operations per second that can be executed in an optimal scenario and may not be typical. TOPS may vary based on several factors, including the specific system configuration, AI model, and software version. GD-243.

    Media Contacts:
    Brandi Martina
    AMD Communications
    +1 512-705-1720 
    brandi.martina@amd.com

    Mitch Haws
    AMD Investor Relations
    +1 512-944-0790
    mitch.haws@amd.com

    The MIL Network –

    January 23, 2025
  • MIL-OSI: AMD Launches 5th Gen AMD EPYC CPUs, Maintaining Leadership Performance and Features for the Modern Data Center

    Source: GlobeNewswire (MIL-OSI)

    — New EPYC processors deliver record breaking performance and efficiency for a wide range of data center workloads —

    — AMD EPYC CPUs continue momentum, with more than 950 AMD EPYC-powered public instances available globally and more than 350 platforms from OxMs —

    SAN FRANCISCO, Oct. 10, 2024 (GLOBE NEWSWIRE) — AMD (NASDAQ: AMD) today announced the availability of the 5th Gen AMD EPYC™ processors, formerly codenamed “Turin,” the world’s best server CPU for enterprise, AI and cloud1.

    Using the “Zen 5” core architecture, compatible with the broadly deployed SP5 platform2 and offering a broad range of core counts spanning from 8 to 192, the AMD EPYC 9005 Series processors extend the record-breaking performance3 and energy efficiency of the previous generations, with the top-of-stack 192-core CPU delivering up to 2.7X the performance4 of the competition.

    New to the AMD EPYC 9005 Series CPUs is the 64 core AMD EPYC 9575F, tailor made for GPU powered AI solutions that need the ultimate in host CPU capabilities. Boosting up to 5GHz5, compared to the competition’s 3.8GHz processor, it provides the up to 28% faster processing needed to keep GPUs fed with data in demanding AI workloads.

    “From powering the world’s fastest supercomputers, to leading enterprises, to the largest Hyperscalers, AMD has earned the trust of customers who value demonstrated performance, innovation and energy efficiency,” said Dan McNamara, senior vice president and general manager, server business, AMD. “With five generations of on-time roadmap execution, AMD has proven it can meet the needs of the data center market and give customers the standard for data center performance, efficiency, solutions and capabilities for cloud, enterprise and AI workloads.”

    The World’s Best CPU for Enterprise, AI and Cloud Workloads

    Modern data centers run a variety of workloads, from supporting corporate AI-enablement initiatives, to powering large-scale cloud-based infrastructures, to hosting the most demanding business-critical applications. The new 5th Gen AMD EPYC processors provide leading performance and capabilities for the broad spectrum of server workloads driving business IT today.

    The new “Zen 5” core architecture provides up to 17% better instructions per clock (IPC) for enterprise and cloud workloads and up to 37% higher IPC in AI and high performance computing (HPC) compared to “Zen 4.”6

    With AMD EPYC 9965 processor-based servers, customers can expect significant impact in their real world applications and workloads compared to the Intel Xeon® 8592+ CPU-based servers, with:

    • Up to 4X faster time to results on business applications such as video transcoding.7
    • Up to 3.9X faster time to insights for science and HPC applications that solve the world’s most challenging problems.8
    • Up to 1.6X the performance per core in virtualized infrastructure.9

    In addition to leadership performance and efficiency in general purpose workloads, 5th Gen AMD EPYC processors enable customers to drive fast time to insights for AI deployments, whether they are running a CPU-only or a CPU + GPU solution.

    Compared to the competition:

    • The 192 core EPYC 9965 CPU has up to 3.7X the performance on end-to-end AI workloads, like TPCx-AI (derivative), which are critical for driving an efficient approach to generative AI.10
    • In small and medium size enterprise-class generative AI models, like Meta’s Llama 3.1-8B, the EPYC 9965 provides 1.9X the throughput performance compared to the competition.11
    • Finally, the purpose built AI host node CPU, the EPYC 9575F, can use its 5GHz max frequency boost to help a 1,000 node AI cluster drive up to 700,000 more inference tokens per second. Accomplishing more, faster.12
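    The cluster-level token claim can be sanity-checked with simple arithmetic; only the 1,000-node cluster size and the 700,000 tokens-per-second uplift come from the text above:

```python
# Figures quoted in the press release (underlying footnote 12 not reproduced here).
extra_tokens_per_sec_cluster = 700_000  # additional inference tokens/s, whole cluster
nodes = 1_000                           # AI cluster size

extra_per_node = extra_tokens_per_sec_cluster / nodes  # tokens/s per node
extra_per_day = extra_tokens_per_sec_cluster * 86_400  # tokens per day, cluster-wide
print(f"{extra_per_node:.0f} extra tokens/s per node")
print(f"{extra_per_day / 1e9:.1f}B extra tokens per day")
```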

    By modernizing to a data center powered by these new processors to achieve 391,000 units of SPECrate®2017_int_base general purpose computing performance, customers receive impressive performance for various workloads, while gaining the ability to use an estimated 71% less power and ~87% fewer servers13. This gives CIOs the flexibility to either benefit from the space and power savings or add performance for day-to-day IT tasks while delivering impressive AI performance.
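    The consolidation claim follows from straightforward fleet arithmetic: divide the capacity target by the per-server score to size each fleet, then compare server counts and power draw. A hedged sketch; only the 391,000-unit SPECrate target comes from the text, while the per-server scores and wattages below are illustrative assumptions, not AMD's model:

```python
import math

TARGET = 391_000  # SPECrate2017_int_base capacity goal (from the text above)

# Hypothetical fleet profiles: per-server SPECrate score and wall power; NOT AMD's figures.
legacy = {"score": 425.0, "watts": 950.0}    # assumed older dual-socket servers
modern = {"score": 3000.0, "watts": 1650.0}  # assumed new 2P high-core-count servers

def fleet(profile):
    """Servers needed to hit TARGET, and the resulting fleet power in kW."""
    servers = math.ceil(TARGET / profile["score"])
    return servers, servers * profile["watts"] / 1000.0

old_n, old_kw = fleet(legacy)
new_n, new_kw = fleet(modern)
print(f"servers: {old_n} -> {new_n} ({1 - new_n / old_n:.0%} fewer)")
print(f"power:   {old_kw:.0f} kW -> {new_kw:.0f} kW ({1 - new_kw / old_kw:.0%} less)")
```

    With these placeholder profiles the sketch yields reductions in the same ballpark as the quoted ~87% fewer servers and 71% less power; the exact percentages depend entirely on the per-server assumptions.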

    AMD EPYC CPUs – Driving Next Wave of Innovation
    The proven performance and deep ecosystem support across partners and customers have driven widespread adoption of EPYC CPUs to power the most demanding computing tasks. With leading performance, features and density, AMD EPYC CPUs help customers drive value in their data centers and IT environments quickly and efficiently.

    5th Gen AMD EPYC Features
    The entire lineup of 5th Gen AMD EPYC processors is available today, with support from Cisco, Dell, Hewlett Packard Enterprise, Lenovo and Supermicro as well as all major ODMs and cloud service providers providing a simple upgrade path for organizations seeking compute and AI leadership.

    High level features of the AMD EPYC 9005 series CPUs include:

    • Leadership core count options from 8 to 192 per CPU
    • “Zen 5” and “Zen 5c” core architectures
    • 12 channels of DDR5 memory per CPU
    • Support for up to DDR5-6400 MT/s14
    • Leadership boost frequencies up to 5GHz5
    • AVX-512 with the full 512b data path
    • Trusted I/O for Confidential Computing, and FIPS certification in process for every part in the series
    Model (AMD EPYC) | Cores | Core architecture | Base/Boost5 (up to GHz) | Default TDP (W) | L3 Cache (MB) | Price (1 KU, USD)
    9965  | 192 | “Zen5c” | 2.25 / 3.7 | 500 | 384 | $14,813
    9845  | 160 | “Zen5c” | 2.1 / 3.7  | 390 | 320 | $13,564
    9825  | 144 | “Zen5c” | 2.2 / 3.7  | 390 | 384 | $13,006
    9755  | 128 | “Zen5”  | 2.7 / 4.1  | 500 | 512 | $12,984
    9745  | 128 | “Zen5c” | 2.4 / 3.7  | 400 | 256 | $12,141
    9655  | 96  | “Zen5”  | 2.6 / 4.5  | 400 | 384 | $11,852
    9655P | 96  | “Zen5”  | 2.6 / 4.5  | 400 | 384 | $10,811
    9645  | 96  | “Zen5c” | 2.3 / 3.7  | 320 | 384 | $11,048
    9565  | 72  | “Zen5”  | 3.15 / 4.3 | 400 | 384 | $10,486
    9575F | 64  | “Zen5”  | 3.3 / 5.0  | 400 | 256 | $11,791
    9555  | 64  | “Zen5”  | 3.2 / 4.4  | 360 | 256 | $9,826
    9555P | 64  | “Zen5”  | 3.2 / 4.4  | 360 | 256 | $7,983
    9535  | 64  | “Zen5”  | 2.4 / 4.3  | 300 | 256 | $8,992
    9475F | 48  | “Zen5”  | 3.65 / 4.8 | 400 | 256 | $7,592
    9455  | 48  | “Zen5”  | 3.15 / 4.4 | 300 | 192 | $5,412
    9455P | 48  | “Zen5”  | 3.15 / 4.4 | 300 | 192 | $4,819
    9365  | 36  | “Zen5”  | 3.4 / 4.3  | 300 | 256 | $4,341
    9375F | 32  | “Zen5”  | 3.8 / 4.8  | 320 | 256 | $5,306
    9355  | 32  | “Zen5”  | 3.55 / 4.4 | 280 | 256 | $3,694
    9355P | 32  | “Zen5”  | 3.55 / 4.4 | 280 | 256 | $2,998
    9335  | 32  | “Zen5”  | 3.0 / 4.4  | 210 | 256 | $3,178
    9275F | 24  | “Zen5”  | 4.1 / 4.8  | 320 | 256 | $3,439
    9255  | 24  | “Zen5”  | 3.25 / 4.3 | 200 | 128 | $2,495
    9175F | 16  | “Zen5”  | 4.2 / 5.0  | 320 | 512 | $4,256
    9135  | 16  | “Zen5”  | 3.65 / 4.3 | 200 | 64  | $1,214
    9115  | 16  | “Zen5”  | 2.6 / 4.1  | 125 | 64  | $726
    9015  | 8   | “Zen5”  | 3.6 / 4.1  | 125 | 64  | $527

    Supporting Resources


    Cautionary Statement
    This press release contains forward-looking statements concerning Advanced Micro Devices, Inc. (AMD) such as the features, functionality, performance, availability, timing and expected benefits of AMD products including AMD EPYC™ processors, which are made pursuant to the Safe Harbor provisions of the Private Securities Litigation Reform Act of 1995. Forward-looking statements are commonly identified by words such as “would,” “may,” “expects,” “believes,” “plans,” “intends,” “projects” and other terms with similar meaning. Investors are cautioned that the forward-looking statements in this press release are based on current beliefs, assumptions and expectations, speak only as of the date of this press release and involve risks and uncertainties that could cause actual results to differ materially from current expectations. Such statements are subject to certain known and unknown risks and uncertainties, many of which are difficult to predict and generally beyond AMD’s control, that could cause actual results and other future events to differ materially from those expressed in, or implied or projected by, the forward-looking information and statements. 

    AMD, the AMD Arrow logo, EPYC and combinations thereof are trademarks of Advanced Micro Devices, Inc. Other names are for informational purposes only and may be trademarks of their respective owners.

    1 EPYC-029C: Comparison based on thread density, performance, features, process technology and built-in security features of currently shipping servers as of 10/10/2024. EPYC 9005 series CPUs offer the highest thread density [EPYC-025B] and lead the industry with 500+ performance world records [EPYC-023F], including world record enterprise Java® ops/sec performance [EPYCWR-20241010-260], top HPC leadership with floating-point throughput performance [EPYCWR-2024-1010-381], AI end-to-end performance with TPCx-AI [EPYCWR-2024-1010-525] and highest energy efficiency scores [EPYCWR-20241010-326]. The 5th Gen EPYC series also has 50% more DDR5 memory channels [EPYC-033C] with 70% more memory bandwidth [EPYC-032C], supports 70% more PCIe® Gen5 lanes for I/O throughput [EPYC-035C], has up to 5x the L3 cache/core [EPYC-043C] for faster data access, uses advanced 3-4nm technology, and offers Secure Memory Encryption + Secure Encrypted Virtualization (SEV) + SEV Encrypted State + SEV-Secure Nested Paging security features. See the AMD EPYC Architecture White Paper (https://library.amd.com/l/3f4587d147382e2/) for more information.

    2 AMD EPYC™ 9005 processors utilize the SP5 socket. Many factors determine system compatibility. Check with your server manufacturer to determine if this processor is supported in systems configured with previously launched AMD EPYC 9004 family CPUs.

    3 EPYC-022F: For a complete list of world records see: http://amd.com/worldrecords.

    4 9xx5-002C: SPECrate®2017_int_base comparison based on published scores from http://www.spec.org as of 10/10/2024.

    2P AMD EPYC 9965 (3000 SPECrate®2017_int_base, 384 Total Cores, 500W TDP, $14,813 CPU $), 6.060 SPECrate®2017_int_base/CPU W, 0.205 SPECrate®2017_int_base/CPU $, https://www.spec.org/cpu2017/results/res2024q3/cpu2017-20240923-44833.html

    2P AMD EPYC 9755 (2720 SPECrate®2017_int_base, 256 Total Cores, 500W TDP, $12,984 CPU $), 5.440 SPECrate®2017_int_base/CPU W, 0.209 SPECrate®2017_int_base/CPU $, https://www.spec.org/cpu2017/results/res2024q4/cpu2017-20240923-44837.pdf

    2P AMD EPYC 9754 (1950 SPECrate®2017_int_base, 256 Total Cores, 360W TDP, $11,900 CPU $), 5.417 SPECrate®2017_int_base/CPU W, 0.164 SPECrate®2017_int_base/CPU $, https://www.spec.org/cpu2017/results/res2023q2/cpu2017-20230522-36617.html

    2P AMD EPYC 9654 (1810 SPECrate®2017_int_base, 192 Total Cores, 360W TDP, $11,805 CPU $), 5.028 SPECrate®2017_int_base/CPU W, 0.153 SPECrate®2017_int_base/CPU $, https://www.spec.org/cpu2017/results/res2024q1/cpu2017-20240129-40896.html

    2P Intel Xeon Platinum 8592+ (1130 SPECrate®2017_int_base, 128 Total Cores, 350W TDP, $11,600 CPU $), 3.229 SPECrate®2017_int_base/CPU W, 0.097 SPECrate®2017_int_base/CPU $, http://spec.org/cpu2017/results/res2023q4/cpu2017-20231127-40064.html

    2P Intel Xeon 6780E (1410 SPECrate®2017_int_base, 288 Total Cores, 330W TDP, $11,350 CPU $), 4.273 SPECrate®2017_int_base/CPU W, 0.124 SPECrate®2017_int_base/CPU $, https://spec.org/cpu2017/results/res2024q3/cpu2017-20240811-44406.html

    SPEC®, SPEC CPU®, and SPECrate® are registered trademarks of the Standard Performance Evaluation Corporation. See http://www.spec.org for more information. Intel CPU TDP at https://ark.intel.com/.
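    The per-watt and per-dollar figures in footnote 4 are simple ratios of the published SPECrate®2017_int_base score to the single-CPU TDP and 1KU price. Using the EPYC 9755 entry listed above, the quoted values reproduce exactly:

```python
# EPYC 9755 figures quoted in footnote 4: score, CPU TDP (W), 1KU price (USD).
score, tdp_w, price = 2720, 500, 12_984

per_watt = score / tdp_w    # SPECrate2017_int_base per CPU watt
per_dollar = score / price  # SPECrate2017_int_base per CPU dollar
print(f"{per_watt:.3f} /CPU W, {per_dollar:.3f} /CPU $")  # 5.440 /CPU W, 0.209 /CPU $
```

    The same division recovers the ratios for the other entries as well; note the denominator is the single-CPU TDP and price even for the 2P (two-socket) results.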

    5 GD-150: Boost Clock Frequency is the maximum frequency achievable on the CPU running a bursty workload. Boost clock achievability, frequency, and sustainability will vary based on several factors, including but not limited to: thermal conditions and variation in applications and workloads.

    6 9xx5-001: Based on AMD internal testing as of 9/10/2024, geomean performance improvement (IPC) at fixed-frequency.

    – 5th Gen EPYC CPU Enterprise and Cloud Server Workloads generational IPC uplift of 1.170x (geomean), using a select set of 36 workloads; the result is the geomean of estimated scores for the total and all subsets of SPECrate®2017_int_base (geomean), estimated scores for the total and all subsets of SPECrate®2017_fp_base (geomean), scores for Server Side Java multi-instance max ops/sec, representative Cloud Server workloads (geomean), and representative Enterprise Server workloads (geomean).

    “Genoa” Config (all NPS1): EPYC 9654 BIOS TQZ1005D 12c12t (1c1t/CCD in 12+1), FF 3GHz, 12x DDR5-4800 (2Rx4 64GB), 32Gbps xGMI;

    “Turin” config (all NPS1): EPYC 9V45 BIOS RVOT1000F 12c12t (1c1t/CCD in 12+1), FF 3GHz, 12x DDR5-6000 (2Rx4 64GB), 32Gbps xGMI

    Utilizing Performance Determinism and the Performance governor on Ubuntu® 22.04 w/ 6.8.0-40-generic kernel OS for all workloads.

    – 5th Gen EPYC generational ML/HPC Server Workloads IPC uplift of 1.369x (geomean), using a select set of 24 workloads; the result is the geomean of representative ML Server Workloads (geomean) and representative HPC Server Workloads (geomean).

    “Genoa” config (all NPS1): EPYC 9654 BIOS TQZ1005D 12c12t (1c1t/CCD in 12+1), FF 3GHz, 12x DDR5-4800 (2Rx4 64GB), 32Gbps xGMI;

    “Turin” config (all NPS1): EPYC 9V45 BIOS RVOT1000F 12c12t (1c1t/CCD in 12+1), FF 3GHz, 12x DDR5-6000 (2Rx4 64GB), 32Gbps xGMI

    Utilizing Performance Determinism and the Performance governor on Ubuntu 22.04 w/ 6.8.0-40-generic kernel OS for all workloads except LAMMPS, HPCG, NAMD, OpenFOAM, Gromacs which utilize 24.04 w/ 6.8.0-40-generic kernel.

    SPEC® and SPECrate® are registered trademarks of the Standard Performance Evaluation Corporation. Learn more at spec.org.
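    Footnote 6 aggregates its IPC uplift as a geomean taken over workload subsets that are themselves geomeans. A small sketch of that nesting; the per-workload ratios below are invented for illustration, not AMD's measured data:

```python
from math import prod

def geomean(xs):
    """Geometric mean of a list of positive ratios."""
    return prod(xs) ** (1 / len(xs))

# Hypothetical per-workload fixed-frequency IPC ratios grouped by suite; NOT measured data.
suites = {
    "int_subset": [1.14, 1.18, 1.21],
    "fp_subset":  [1.10, 1.25],
    "java":       [1.16],
    "cloud":      [1.19, 1.13],
}

# Geomean over per-suite geomeans, mirroring how footnote 6 composes its uplift.
overall = geomean([geomean(ratios) for ratios in suites.values()])
print(f"overall IPC uplift: {overall:.3f}x")
```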

    7 9xx5-006: AMD internal testing as of 09/01/2024, on FFMPEG (Raw to VP9, 1080P, 302 Frames, 1 instance/thread, video source: https://media.xiph.org/video/derf/y4m/ducks_take_off_1080p50.y4m).

    System Configurations: 2P AMD EPYC™ 9965 reference system (2 x 192C) 1.5TB 24x64GB DDR5-6400 running at 6000MT/s, SAMSUNG MZWLO3T8HCLS-00A07, NPS=4, Ubuntu 22.04.3 LTS, Kernel Linux 5.15.0-119-generic, BIOS RVOT1000C (determinism enable=power), 10825484.25 Frames/Hour Median

    2P AMD EPYC™ 9654 production system (2 x 96C) 1.5TB 24x64GB DDR5-5600, SAMSUNG MO003200KYDNC, NPS=4, Ubuntu 22.04.3 LTS, Kernel Linux 5.15.0-119-generic, BIOS 1.56 (determinism enable=power), 5154133.333 Frames/Hour Median

    2P Intel Xeon Platinum 8592+ production system (2 x 64C) 1TB 16x64GB DDR5-5600, 3.2 TB NVME, Ubuntu 22.04.3 LTS, Kernel Linux 6.5.0-35-generic, BIOS ESE122V-3.10, 2712701.754 Frames/Hour Median

    For 3.99x the performance with the AMD EPYC 9965 vs Intel Xeon Platinum 8592+ systems

    For 1.90x the performance with the AMD EPYC 9654 vs Intel Xeon Platinum 8592+ systems
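    The two ratios above follow directly from the median frames/hour figures in the configurations; a quick check:

```python
# Median frames/hour from the FFMPEG configurations listed above.
epyc_9965 = 10825484.25   # 2P AMD EPYC 9965
epyc_9654 = 5154133.333   # 2P AMD EPYC 9654
xeon_8592 = 2712701.754   # 2P Intel Xeon Platinum 8592+

print(round(epyc_9965 / xeon_8592, 2))  # speedup vs Xeon, ~3.99x
print(round(epyc_9654 / xeon_8592, 2))  # speedup vs Xeon, ~1.90x
```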

    Results may vary based on factors including but not limited to BIOS and OS settings and versions, software versions and data used.

    8 9xx5-022: Source: https://www.amd.com/content/dam/amd/en/documents/epyc-technical-docs/performance-briefs/amd-epyc-9005-pb-gromacs.pdf

    9 9xx5-071: VMmark® 4.0.1 host/node FC SAN comparison based on “independently published” results as of 10/10/2024.  
    Configurations:

    2 node, 2P AMD EPYC 9575F (128 total cores) powered server running VMware ESXi8.0 U3, 3.31 @ 4 tiles,
    https://www.infobellit.com/BlueBookSeries/VMmark4-FDR-1003

    2 node, 2P AMD EPYC 9554 (128 total cores) powered server running VMware ESXi 8.0 U3, 2.64 @ 3 tiles,
    https://www.infobellit.com/BlueBookSeries/VMmark4-FDR-1002

    2 node, 2P Intel Xeon Platinum 8592+ (128 total cores) powered server running VMware ESXi 8.0 U3, 2.06 @ 2.4 Tiles,
    https://www.infobellit.com/BlueBookSeries/VMmark4-FDR-1001

    VMmark is a registered trademark of VMware in the US or other countries.

    10 9xx5-012: TPCxAI @SF30 Multi-Instance 32C Instance Size throughput results based on AMD internal testing as of 09/05/2024 running multiple VM instances. The aggregate end-to-end AI throughput test is derived from the TPCx-AI benchmark and as such is not comparable to published TPCx-AI results, as the end-to-end AI throughput test results do not comply with the TPCx-AI Specification.

    2P AMD EPYC 9965 (384 Total Cores), 12 32C instances, NPS1, 1.5TB 24x64GB DDR5-6400 (at 6000 MT/s), 1DPC, 1.0 Gbps NetXtreme BCM5720 Gigabit Ethernet PCIe, 3.5 TB Samsung MZWLO3T8HCLS-00A07 NVMe®, Ubuntu® 22.04.4 LTS, 6.8.0-40-generic (tuned-adm profile throughput-performance, ulimit -l 198096812, ulimit -n 1024, ulimit -s 8192), BIOS RVOT1000C (SMT=off, Determinism=Power, Turbo Boost=Enabled)

    2P AMD EPYC 9755 (256 Total Cores), 8 32C instances, NPS1, 1.5TB 24x64GB DDR5-6400 (at 6000 MT/s), 1DPC, 1.0 Gbps NetXtreme BCM5720 Gigabit Ethernet PCIe, 3.5 TB Samsung MZWLO3T8HCLS-00A07 NVMe®, Ubuntu 22.04.4 LTS, 6.8.0-40-generic (tuned-adm profile throughput-performance, ulimit -l 198096812, ulimit -n 1024, ulimit -s 8192), BIOS RVOT0090F (SMT=off, Determinism=Power, Turbo Boost=Enabled)

    2P AMD EPYC 9654 (192 Total cores) 6 32C instances, NPS1, 1.5TB 24x64GB DDR5-4800, 1DPC, 2 x 1.92 TB Samsung MZQL21T9HCJR-00A07 NVMe, Ubuntu 22.04.3 LTS, BIOS 1006C (SMT=off, Determinism=Power)

    Versus 2P Xeon Platinum 8592+ (128 Total Cores), 4 32C instances, AMX On, 1TB 16x64GB DDR5-5600, 1DPC, 1.0 Gbps NetXtreme BCM5719 Gigabit Ethernet PCIe, 3.84 TB KIOXIA KCMYXRUG3T84 NVMe, Ubuntu 22.04.4 LTS, 6.5.0-35 generic (tuned-adm profile throughput-performance, ulimit -l 132065548, ulimit -n 1024, ulimit -s 8192), BIOS ESE122V (SMT=off, Determinism=Power, Turbo Boost = Enabled)

    Results:

    CPU                  Median    Relative  Generational
    Turin 192C, 12 Inst  6067.531  3.775     2.278
    Turin 128C, 8 Inst   4091.85   2.546     1.536
    Genoa 96C, 6 Inst    2663.14   1.657     1
    EMR 64C, 4 Inst      1607.417  1         NA
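    The Relative and Generational columns are simple ratios of the median throughput scores against the Xeon and prior-generation EPYC baselines; they can be reproduced directly:

```python
# Median aggregate throughput from the results table above.
scores = {"Turin 192C": 6067.531, "Turin 128C": 4091.85,
          "Genoa 96C": 2663.14, "EMR 64C": 1607.417}

for cpu in ("Turin 192C", "Turin 128C", "Genoa 96C"):
    relative = scores[cpu] / scores["EMR 64C"]        # vs Xeon baseline
    generational = scores[cpu] / scores["Genoa 96C"]  # vs prior-gen EPYC
    print(cpu, round(relative, 3), round(generational, 3))
```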

    Results may vary due to factors including system configurations, software versions and BIOS settings. TPC, TPC Benchmark and TPC-C are trademarks of the Transaction Processing Performance Council.

    11 9xx5-009: Llama3.1-8B throughput results based on AMD internal testing as of 09/05/2024.

    Llama3-8B configurations: IPEX.LLM 2.4.0, NPS=2, BF16, batch size 4, Use Case Input/Output token configurations: [Summary = 1024/128, Chatbot = 128/128, Translate = 1024/1024, Essay = 128/1024, Caption = 16/16].

    2P AMD EPYC 9965 (384 Total Cores), 6 64C instances, 1.5TB 24x64GB DDR5-6400 (at 6000 MT/s), 1 DPC, 1.0 Gbps NetXtreme BCM5720 Gigabit Ethernet PCIe, 3.5 TB Samsung MZWLO3T8HCLS-00A07 NVMe®, Ubuntu® 22.04.3 LTS, 6.8.0-40-generic (tuned-adm profile throughput-performance, ulimit -l 198096812, ulimit -n 1024, ulimit -s 8192), BIOS RVOT1000C (SMT=off, Determinism=Power, Turbo Boost=Enabled), NPS=2

    2P AMD EPYC 9755 (256 Total Cores), 4 64C instances, 1.5TB 24x64GB DDR5-6400 (at 6000 MT/s), 1DPC, 1.0 Gbps NetXtreme BCM5720 Gigabit Ethernet PCIe, 3.5 TB Samsung MZWLO3T8HCLS-00A07 NVMe®, Ubuntu 22.04.3 LTS, 6.8.0-40-generic (tuned-adm profile throughput-performance, ulimit -l 198096812, ulimit -n 1024, ulimit -s 8192), BIOS RVOT1000C (SMT=off, Determinism=Power, Turbo Boost=Enabled), NPS=2

    2P AMD EPYC 9654 (192 Total Cores), 4 48C instances, 1.5TB 24x64GB DDR5-4800, 1DPC, 1.0 Gbps NetXtreme BCM5720 Gigabit Ethernet PCIe, 3.5 TB Samsung MZWLO3T8HCLS-00A07 NVMe®, Ubuntu® 22.04.4 LTS, 5.15.85-051585-generic (tuned-adm profile throughput-performance, ulimit -l 1198117616, ulimit -n 500000, ulimit -s 8192), BIOS RVI1008C (SMT=off, Determinism=Power, Turbo Boost=Enabled), NPS=2

    Versus 2P Xeon Platinum 8592+ (128 Total Cores), 2 64C instances, AMX On, 1TB 16x64GB DDR5-5600, 1DPC, 1.0 Gbps NetXtreme BCM5719 Gigabit Ethernet PCIe, 3.84 TB KIOXIA KCMYXRUG3T84 NVMe®, Ubuntu 22.04.4 LTS, 6.5.0-35-generic (tuned-adm profile throughput-performance, ulimit -l 132065548, ulimit -n 1024, ulimit -s 8192), BIOS ESE122V (SMT=off, Determinism=Power, Turbo Boost = Enabled).
    Results:

    CPU                                        2P EMR 64c  2P Turin 192c  2P Turin 128c  2P Genoa 96c
    Average Aggregate Median Total Throughput  99.474      193.267        182.595        138.978
    Competitive                                1           1.943          1.836          1.397
    Generational                               NA          1.391          1.314          1
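    The Competitive and Generational rows are ratios of the average aggregate median total throughput against the Xeon and Genoa baselines; reproduced:

```python
# Average aggregate median total throughput from the table above.
throughput = {"EMR 64c": 99.474, "Turin 192c": 193.267,
              "Turin 128c": 182.595, "Genoa 96c": 138.978}

for cpu in ("Turin 192c", "Turin 128c", "Genoa 96c"):
    competitive = throughput[cpu] / throughput["EMR 64c"]    # vs Xeon
    generational = throughput[cpu] / throughput["Genoa 96c"] # vs prior gen
    print(cpu, round(competitive, 3), round(generational, 3))
```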

    Results may vary due to factors including system configurations, software versions and BIOS settings.

    12 9xx5-087: As of 10/10/2024; this scenario contains several assumptions and estimates and, while based on AMD internal research and best approximations, should be considered an example for information purposes only, and not used as a basis for decision making over actual testing.

    Referencing 9XX5-056A: “2P AMD EPYC 9575F powered server and 8x AMD Instinct MI300X GPUs running Llama3.1-70B select inference workloads at FP8 precision vs 2P Intel Xeon Platinum 8592+ powered server and 8x AMD Instinct MI300X GPUs has ~8% overall throughput increase across select inference use cases” and 8763.52 tokens/s (9575F) versus 8,048.48 tokens/s (8592+) at 128 input / 2048 output tokens, 500 prompts for 1.089x the tokens/s or 715.04 more tokens/s.

    1 Node = 2 CPUs and 8 GPUs.
    Assuming a 1000 node cluster, 1000 * 715.04 = 715,040 tokens/s

    For ~700,000 more tokens/s
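    The cluster-level figure is straightforward arithmetic on the per-node throughputs quoted above (the 1000-node cluster is the scenario's stated assumption):

```python
node_9575f = 8763.52  # tokens/s per node, 2P EPYC 9575F host CPUs
node_8592 = 8048.48   # tokens/s per node, 2P Xeon Platinum 8592+ host CPUs
nodes = 1000          # assumed cluster size (1 node = 2 CPUs + 8 GPUs)

per_node_gain = node_9575f - node_8592
print(round(node_9575f / node_8592, 3))  # relative throughput, ~1.089x
print(round(nodes * per_node_gain))      # extra tokens/s across the cluster
```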

    Results may vary due to factors including system configurations, software versions and BIOS settings.

    13 9xx5TCO-001a: This scenario contains many assumptions and estimates and, while based on AMD internal research and best approximations, should be considered an example for information purposes only, and not used as a basis for decision making over actual testing. The AMD Server & Greenhouse Gas Emissions TCO (total cost of ownership) Estimator Tool – version 1.12, compares the selected AMD EPYC™ and Intel® Xeon® CPU based server solutions required to deliver a TOTAL_PERFORMANCE of 39100 units of SPECrate2017_int_base performance as of October 10, 2024. This scenario compares a legacy 2P Intel Xeon 28 core Platinum_8280 based server with a score of 391 versus a 2P EPYC 9965 (192C) powered server with a score of 3030 (https://spec.org/cpu2017/results/res2024q3/cpu2017-20240923-44833.pdf) along with a comparison upgrade to a 2P Intel Xeon Platinum 8592+ (64C) based server with a score of 1130 (https://spec.org/cpu2017/results/res2024q3/cpu2017-20240701-43948.pdf). Actual SPECrate®2017_int_base score for 2P EPYC 9965 will vary based on OEM publications.

    Environmental impact estimates were made leveraging this data, using the country/region-specific electricity factors from the 2024 International Country Specific Electricity Factors 10 – July 2024, and the United States Environmental Protection Agency ‘Greenhouse Gas Equivalencies Calculator’.

    For additional details, see https://www.amd.com/en/claims/epyc5#9xx5TCO-001a

    14 9xx5-083: 5th Gen EPYC processors support DDR5-6400 MT/s for targeted customers and configurations. 5th Gen production SKUs support up to DDR5-6000 MT/s to enable a broad set of DIMMs across all OEM platforms and maintain SP5 platform compatibility.

    A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/3bb614ee-e307-43a7-a36b-f5bd02ed1335

    The MIL Network –

    January 23, 2025
  • MIL-OSI: Alliance Witan PLC – Net Asset Value

    Source: GlobeNewswire (MIL-OSI)

    ALLIANCE WITAN PLC

    At the close of business on Wednesday 09 October 2024:

    The Company’s NAV per ordinary share, valued on a bid price basis with Debt at Par, was

    – excluding income, 1265.9p

    – including income, 1266.7p

    The Company’s NAV per ordinary share, valued on a bid price basis with Debt at Fair Value, was

    – excluding income, 1283.2p

    – including income, 1284.0p

    For further information, please contact:

    Juniper Partners Limited
    Tel. +44 (0)131 378 0500

    Notes

    1. Net Asset Values are calculated in accordance with published accounting policies and AIC guidelines.
    2. The fair value of the Company’s fixed loan notes is calculated by reference to a benchmark gilt.

  • MIL-OSI: Publication of a Circular – Notice of General Meeting

    Source: GlobeNewswire (MIL-OSI)

    NOT FOR RELEASE, PUBLICATION OR DISTRIBUTION IN WHOLE OR IN PART, DIRECTLY OR INDIRECTLY, IN, INTO OR FROM THE UNITED STATES, CANADA, AUSTRALIA, JAPAN, THE REPUBLIC OF SOUTH AFRICA OR ANY OTHER JURISDICTION WHERE TO DO SO WOULD CONSTITUTE A VIOLATION OF THE RELEVANT LAWS OR REGULATIONS OF THAT JURISDICTION. THE INFORMATION CONTAINED HEREIN DOES NOT CONSTITUTE AN OFFER OF SECURITIES FOR SALE IN ANY JURISDICTION, INCLUDING IN THE UNITED STATES, CANADA, AUSTRALIA, JAPAN OR THE REPUBLIC OF SOUTH AFRICA.

    HARGREAVE HALE AIM VCT PLC

    LEI: 213800LRYA19A69SIT31

    10 October 2024

    Publication of a circular

    On 9 October 2024, Hargreave Hale AIM VCT plc (the “Company”) launched an offer for subscription to raise up to £20 million (the “Offer”).

    The Company has also published a circular convening a general meeting (the “General Meeting”) to be held at 9.30 a.m. on 12 November 2024 at the offices of Canaccord Genuity Asset Management Limited, 88 Wood Street, London EC2V 7QR (the “Circular”). At the General Meeting, shareholders will be asked to approve: (i) share issuance authorities in relation to the Offer; and (ii) amendments to the Company’s articles of association in order to extend the date of the next continuation vote to the annual general meeting of the Company to be held in 2031.

    The Circular is available to download from the Company’s website, http://www.hargreaveaimvcts.co.uk, subject to certain access restrictions and will also shortly be available for inspection at the National Storage Mechanism, https://data.fca.org.uk/#/nsm/nationalstoragemechanism.

    For further information please contact:
    Oliver Bedford, Canaccord Genuity Asset Management Limited
    Tel: 020 7523 4837

    Important Information
    This announcement and the information contained herein is not intended to, and does not, constitute or form part of any offer, invitation, or the solicitation of an offer, to purchase, otherwise acquire, subscribe for, sell or otherwise dispose of any securities or the solicitation of any vote or approval in any jurisdiction.

    The distribution of this announcement in jurisdictions other than the United Kingdom and the availability of the Offer to persons who are not resident in the United Kingdom may be affected by the laws of relevant jurisdictions. Therefore any persons who are subject to the laws of any jurisdiction other than the United Kingdom will need to inform themselves about, and observe, any applicable requirements.

  • MIL-OSI: Capgemini’s World Energy Markets Observatory annual report 2024: The Paris Agreement’s goals are no longer achievable, but net zero is still in sight with accelerated efforts

    Source: GlobeNewswire (MIL-OSI)

    Press contact:
    Florence Lievre
    Tel.: +33 1 47 54 50 71
    Email: florence.lievre@capgemini.com

    Capgemini’s World Energy Markets Observatory annual report 2024:
    The Paris Agreement’s goals are no longer achievable, but net zero is still in sight with accelerated efforts

    • Despite impressive strides in 2023 and positive projections for 2024, the pace of renewable development isn’t fast enough
    • The critical role of nuclear energy to addressing increased clean energy demands is now recognized, but construction of new large power plants takes time and industrialization of Small Modular Reactors (SMRs) is proving complex
    • Addressing the complexity of energy transition challenges will require new market mechanisms encouraging further innovation, choosing appropriate measures, and accelerated public and private investment in low carbon technologies and the power grid

    Paris, October 10, 2024 – Capgemini has published the 26th edition of its annual World Energy Markets Observatory (WEMO), created in partnership with Hogan Lovells, Vaasa ETT and Enerdata. The report takes stock of the current state of the energy transition. Despite progress being made, greenhouse gas (GHG) emissions are continuing to increase, reaching a new record high of 37.4 billion tonnes (Gt) in 2023¹, confirming that the path to reaching the Paris Agreement’s objectives is not on track. The report provides insights into the key focus areas needed, moving forward, to address the complex energy transition challenges, including a change in how clean energy progress is measured, as well as accelerated investment in the power grid and clean technologies.

    James Forrest, Global Energy Transition & Utilities Industry Leader at Capgemini says: “Despite an historical spike in renewable penetration, the pace of development isn’t fast enough to close the gap. There is still much to do in the next decade to get closer to net zero by 2050 and achieve a successful energy transition: whether it be in the field of low carbon technologies, R&D efforts, nuclear or grid flexibility and storage. In addition, beyond the necessary adoption of new market mechanisms, a shift away from measuring energy based on primary consumption is needed. This measurement was relevant during past energy crises, but it is now time to adopt a more holistic approach. Moving to a final energy demand measurement would better assess clean energy progress and ensure more accurate projections.”

    Key observations from the 2024 report include:

    • There is a need to hasten the deployment of renewable energy globally, and to accelerate it in developing countries, to deliver the 2030 and 2050 decarbonization goals. The total amount of final energy provided by renewable energy is likely to be limited to about 40% of global needs. In 2023, total renewable energy capacity increased by 14% year on year, with a larger capacity expansion in solar (32%) than wind (13%). But whilst 2024 promises to set another record, as has been the case for each of the previous 22 years, this growth is far below what is needed to achieve net zero carbon by 2050. Moreover, as renewable penetration rates increase, they affect grid stability, and pairing with stationary batteries will become compulsory. According to the report, the development of storable renewable energies, such as biomass or geothermal energy, should be accelerated.
    • Hydrogen is now a strategic lever in the decarbonization path. The number of projects reaching final investment decision has quadrupled over the last two years. However, a refocus of applications has been observed due to the increasing costs of low-carbon hydrogen production, competition between uses, and regulations. Only certain uses in ‘Hard to Abate’ industries, such as heavy industry and maritime mobility, have strong potential.
    • Global nuclear capacity needs to triple to ensure stable, low-carbon power. COP28 recognized the critical role of nuclear energy in reducing the effects of climate change. While there is some promising progress in the nuclear renaissance, including Small Modular Reactors (SMRs), development of new nuclear power plants remains difficult. In 2023, 440 nuclear reactors (390 GW) provided 9% of the world’s electricity and 25% of its low-carbon electricity. SMRs are in the planning or early construction stages, with many years to go before they are deployed at scale, as their industrialization can prove complex. According to the report, more focus needs to be placed on extending the life of existing nuclear plants.
    • The power grid plays a fundamental role in accelerating clean energy transitions. Grid investment is starting to pick up and is expected to reach USD 400 billion in 2024², with Europe, the United States, China and parts of Latin America leading the way. According to the report, better forecasting of electricity consumption and finer optimization scenarios, enabled by technologies such as AI, will help improve grid balancing.
    • Whilst AI has the potential to significantly accelerate decarbonization, a lack of skills and a focus on short-term proofs of concept are hampering adoption to date. However, AI coupled with GenAI in agentic LLM (Large Language Model) workflows³ has a clear role to play as a catalyst to improve grid efficiency; e-fuel discovery; new battery and wind turbine design; synthetic biology; and augmented insights from many data sources for better-informed decision making.
    • Protectionist approaches to increasing energy sovereignty may have undesirable implications. Ongoing geopolitical uncertainties are affecting energy markets and systems. To ensure security of supply, the use of embargoes, tariffs and subsidies in almost all jurisdictions is distorting energy markets and threatening the efficient allocation of capital. According to the report, embargoes are proving ineffective and are decreasing the transparency and traceability of energy supplies, which is essential to tracking decarbonization efforts. Denying access to the cheapest sources of energy equipment and energy supplies drives up prices for consumers and reduces funding available for the energy transition.
    • According to the report, ‘Primary Energy Demand’ is an outdated concept for the energy transition. There is a need to move from primary to final energy consumption measurement (in kWh) to ensure accurate projections and clean energy progress. Measuring energy based on primary consumption ignores that, for the same end-energy services, new electric services are generally more efficient; that a lot of fossil fuel is wasted in the generation of electricity; and that energy is also wasted on finding and processing fossil fuels.

    The World Energy Markets Observatory (WEMO) is Capgemini’s annual thought leadership and research report, created in partnership with Hogan Lovells, Vaasa ETT and Enerdata, that tracks the transformation of global energy markets, including Europe, North America, Australia, Southeast Asia, India, and China. Now in its 26th edition, the report has been prepared by a global team of over 100 experts and includes 15 articles, all backed by rigorous analysis. The report begins with a global outlook, then covers topics pivotal to the energy transition including geopolitical impacts, demand-side energy transition, batteries, renewables, SMRs, hydrogen, industrial heat, GenAI and the Inflation Reduction Act (IRA).
    For more information and to get access to the report, click here

    About Capgemini
    Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
    Get The Future You Want | http://www.capgemini.com


    1 Source: IEA – CO2 Emissions in 2023
    2 Source: IEA – Electricity Grids and Secure Energy Transitions

    3 GenAI in agentic LLM (Large Language Model): iterative and collaborative model that transforms the interaction with LLMs into a series of manageable, refinable steps.

    Attachments

    • Infographic – WEMO 2024_Capgemini
    • 2024_10_10_Capgemini_WEMO_Press Release_EN
