Category: Science

  • MIL-OSI: Karolinska Development’s portfolio company Modus Therapeutics carries out a fully secured rights issue of SEK 28.3 million

    Source: GlobeNewswire (MIL-OSI)

    STOCKHOLM, SWEDEN June 27, 2025. Karolinska Development AB (Nasdaq Stockholm: KDEV) today announces that its portfolio company Modus Therapeutics is carrying out a fully secured rights issue of units of SEK 28.3 million. The proceeds from the rights issue are intended to finance the continued development of the drug candidate sevuparin in chronic kidney disease with anemia.

    On June 26, 2025, the portfolio company Modus Therapeutics, listed on Nasdaq First North Growth Market, announced that it is carrying out a fully secured rights issue of units that, upon full subscription, will provide the company with SEK 28.3 million before issue costs. The rights issue is subject to approval by an extraordinary general meeting to be held on July 29, 2025.

    The purpose of the rights issue is to provide capital for the continued clinical development of the drug candidate sevuparin, including completion of the ongoing clinical phase II study, and to finance operations through the end of 2026.

    A number of Modus Therapeutics’ major shareholders, including Karolinska Development, Hans Wigzell and Anders Bladh, have entered into free-of-charge subscription commitments totaling SEK 17.7 million, corresponding to 62.7 percent of the rights issue. The remaining portion, corresponding to 37.3 percent, is covered by underwriting commitments from external parties.

    “Securing a fully subscribed rights issue in today’s challenging market is a clear sign of strength for Modus Therapeutics and its clinical strategy. As the largest owner, we are very pleased that Modus has now secured financing for the continued development of sevuparin, enabling the company to reach important milestones in the near future,” says Viktor Drvota, CEO of Karolinska Development.

    Karolinska Development’s ownership in Modus Therapeutics amounts to 66 percent.

    For further information, please contact:

    Viktor Drvota, CEO, Karolinska Development AB
    Phone: +46 73 982 52 02, e-mail: viktor.drvota@karolinskadevelopment.com

    Johan Dighed, General Counsel and Deputy CEO, Karolinska Development AB
    Phone: +46 70 207 48 26, e-mail: johan.dighed@karolinskadevelopment.com

    TO THE EDITORS

    About Karolinska Development AB

    Karolinska Development AB (Nasdaq Stockholm: KDEV) is a Nordic life sciences investment company. The company focuses on identifying breakthrough medical innovations in the Nordic region that are developed by entrepreneurs and leadership teams. The Company invests in the creation and growth of companies that advance these assets into commercial products that are designed to make a difference to patients’ lives while providing an attractive return on investment to shareholders.

    Karolinska Development has access to world-class medical innovations at the Karolinska Institutet and other leading universities and research institutes in the Nordic region. The Company aims to build companies around scientists who are leaders in their fields, supported by experienced management teams and advisers, and co-funded by specialist international investors, to provide the greatest chance of success.

    Karolinska Development has a portfolio of eleven companies targeting opportunities in innovative treatment for life-threatening or serious debilitating diseases.

    The Company is led by an entrepreneurial team of investment professionals with a proven track record as company builders and with access to a strong global network.

    For more information, please visit www.karolinskadevelopment.com.

    Attachment

    The MIL Network

  • MIL-OSI Analysis: What a 19th-century atlas teaches me about marine ecosystems

    Source: The Conversation – UK – By Ruth H. Thurstan, Associate Professor in Marine and Historical Ecology, University of Exeter

    Ruth Thurstan holds the Piscatorial Atlas Credit: Lee Raby, CC BY-NC-ND

    What stands out most about the book I’m carrying under my arm, as I meander through the exhibits at the National Maritime Museum Cornwall in Falmouth, is its awkwardly large size. The Piscatorial Atlas, authored by Ole Theodor Olsen and published in 1883, contains 50 beautifully illustrated charts of the seas around Great Britain. These show the locations exploited at that time for a variety of fish species, alongside the typical vessels or fishing gear used. This information was collated from fishermen in the decade before the atlas was published.

    The atlas isn’t a book made for travel. Luckily, it can be readily admired online. But leafing through its carefully curated pages, which contain the collective knowledge of so many people who have long since passed away, feels special, and is why I chose it to show to the programme producers today.

    I’ve always loved old books, but I never imagined they would become such an integral part of my work. My interest in marine historical ecology – the use of historical archives to make sense of how our ocean ecosystems are changing – started 18 years ago when I read The Unnatural History of the Sea by Professor Callum Roberts. Within its pages, Roberts details how historical perspectives provide critical insights into the deteriorating health of our seas.



    Local science, global stories.

    This article is part of a series, Secrets of the Sea, exploring how marine scientists are developing climate solutions.

    In collaboration with the BBC, Anna Turns travels around the West Country coastline to meet ocean experts making exciting discoveries beneath the waves.


    In recent decades, fishery declines, degradation of coastal habitats and the loss of large predators show that exploitation, coastal development, pollution and climate change are exacting their toll on marine ecosystems.

    Yet information extracted from old books, reports, and even newspaper articles shows us that many of these issues started long ago. We have exploited the seas for thousands of years, but in Britain, the 19th-century introduction of steam power was a watershed moment: a point in time when our ability to exploit the seas abruptly and dramatically increased. My research aims to uncover how our use of this technological advance – and those that followed – has affected the functioning of marine ecosystems and their continued ability to support our needs.

    Transformation of the seas

    These negative effects are profound. Towards the end of the Piscatorial Atlas is a page dedicated to the native oyster (Ostrea edulis). It is my favourite of the charts. A gradation of colour indicates where oysters were found in abundance at this time. Colour surrounds the coastal seas of Britain and further afield. Strikingly, there is an enormous area of oyster ground delineated in the southern North Sea.

    Today, the native oyster ecosystem is defined as collapsed. The decline of nearshore oyster reefs was well underway by the time the Piscatorial Atlas was published, and the loss of the large North Sea oyster ground – so clear on Olsen’s chart – swiftly followed. As those with the knowledge of these once prolific grounds passed away, the memory of the once vast oyster habitats was lost. This problem was further compounded by science. In the late 19th century, studies of oyster grounds were rare, and scientific surveys almost always occurred after the habitat had been destroyed. Low densities of oysters became the scientific norm.

    Recent research I was involved in with a team of experts used historical sources from across Europe to show just how much change has occurred. We showed that reported native oyster habitat once covered tens of thousands of square kilometres and was a dominant feature of some coastal ecosystems. Multiple layers of old oyster shell, consolidated by a layer of living oysters, provided raised reefs that supported a diverse range of species.

    The economic and cultural significance of oysters created a more visible historical record than many other species. Yet, the history of marine declines is not limited to oysters. Historical sources quote fishermen concerned about the expansion of trawling and fishing effort. They described the efficiency with which sail trawlers and early steam-powered vessels extracted fish and non-target species from the seafloor.

    The impact of land-based activities, such as sediment and pollutant run-off and coastal development, also increased as societies industrialised. These placed marine ecosystems under further pressure, yet regulations governing sustainable management of our seas failed to keep up. These influences, coupled with a collective societal amnesia regarding what we have lost, facilitated the hidden transformation of marine ecosystems.

    Using old books and other deep-time approaches, researchers are increasingly making these transformations visible. Reading the words of people from centuries ago, we learn that their experiences of marine ecosystems were often fundamentally different from our own. Understanding the scale of this difference, where species and habitats existed, and in what abundances, can help make the case for their conservation and restoration.

    People have always made use of the seas. For me, looking to the past isn’t just about understanding what we have lost, it is also about taking positive lessons from the past, such as the myriad ways in which societies benefited from the presence of healthy marine ecosystems. Heeding these lessons from history helps us visualise the full range of possible futures available to us, including the many benefits that more ambitious conservation and restoration of our ocean ecosystems could bring, should we choose this path.

    Ruth H. Thurstan works for The University of Exeter. She receives funding from the Convex Seascape Survey and the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement no. 856488).

    ref. What a 19th-century atlas teaches me about marine ecosystems – https://theconversation.com/what-a-19th-century-atlas-teaches-me-about-marine-ecosystems-251184

    MIL OSI Analysis

  • MIL-OSI Global: What Danish climate migration drama, Families Like Ours, gets wrong about rising sea levels

    Source: The Conversation – UK – By Florian Steig, DPhil Student, Geography and the Environment, University of Oxford

    In the Danish TV drama Families Like Ours, one melancholic line from high-school student Laura captures the emotional toll of climate displacement: “Soon we will vanish like bubbles in a creek.” This seven-part series imagines a near future in which Denmark is being evacuated due to rising sea levels – a government-mandated relocation of an entire population.

    The series challenges the fantasy that wealthy western countries are immune to the far-reaching effects of climate change. Rather than focusing on catastrophic storylines, Families Like Ours portrays the mundane, bureaucratic and affective aspects of relocating a population in anticipation of a creeping crisis: the scramble for visas, the fractures that appear between families, and the inequalities in social and economic capital that shape people’s chances for a new life.

    Yet the idea that Denmark could soon be submerged is not grounded in science. More worryingly, the narrative of the unavoidable uninhabitability of entire nations and of millions of international migrants flooding Europe is misleading and dangerous, and it sidelines deeply political questions about adaptation to sea level rise that should be dealt with now.

    The trailer for Families Like Ours.

    Sea levels are rising by a few millimetres a year. That pace is accelerating. The Intergovernmental Panel on Climate Change predicts that, by 2100, sea levels could rise by up to one metre on average. Beyond 2100, sea levels could rise by several metres, although these long-term scenarios are highly uncertain.

    Even in extreme scenarios, these developments would unfold over several decades and centuries. It’s unlikely that permanent submergence of large areas of land will make Denmark uninhabitable.

    Still, sea level rise poses a serious risk to the livelihoods of millions of people living in coastal zones. In the UK, many homes in Norfolk and Fairbourne, Wales, are already at risk from coastal erosion, for instance.

    These changes are subtle. They do not warrant the evacuation of an entire nation, but degrade coastal livelihoods over time. Houses in high-risk areas like these may become uninsurable, devalued or too risky to live in. This will force people to move.

    In addition, sea level rise makes coastal flooding more likely. In European high-income countries, including Denmark, rising waters already threaten coastal communities. Without adaptation, hundreds of thousands of homes in cities such as Copenhagen could be at risk.

    The danger of mass migration narratives

    However, depicting climate change as a driver of uncontrolled mass migration is misleading. Sea level rise will contribute to coastal migration, and state-led relocation is already a reality, especially in Africa and Asia. But climate migration predominantly occurs within countries or regions. International migration driven by climate change impacts is the exception, not the norm.

    To capture these complexities, some researchers prefer the term “climate mobility”. Mobility can be forced or voluntary, permanent or temporary, even seasonal. Some communities and people resist relocation plans and stay put.

    Families Like Ours reinforces longstanding narratives that frame certain parts of the world as destined to become uninhabitable. Even UN Secretary-General António Guterres warned of a “mass exodus of entire populations on a biblical scale” due to sea level rise.

    As a researcher working on climate adaptation, I notice that sea level rise and climate migration are increasingly discussed at the global level. Discussions focus, for example, on the protection of affected populations and the continued statehood of nations after their potential submergence. A new global alliance of cities and regions tackling sea level rise, the Ocean Rise & Coastal Resilience Coalition, considers “managed retreat” not only inevitable but a rational and desirable adaptation pathway for many cities and regions.

    Scientists have warned that creative storylines highlighting the “uninhabitability” of low-lying countries and regions, such as the Pacific, are not helpful. The mass migration narrative can be used by governments to justify extreme protectionist action and sideline urgent adaptation debates.

    States are not helpless in the face of sea level rise and submergence is not inevitable. As geographer Carol Farbotko and colleagues suggest, “habitability is mediated by human actions and is not a direct consequence of environmental change”. People often develop their own ways of living with rising waters, resisting narratives of submergence. State-led adaptation is possible, but depends on finance, which is unequally distributed.

    People’s migration decisions can seldom be attributed to climate impacts alone. A community’s capacity to respond hinges on social, political, economic and demographic factors. Adaptation measures are costly. This raises deeply political questions over who gets to be protected, who is left behind, and how managed retreat can benefit the most affected people and places in a fair way. We need to overcome mass migration myths and start a serious and justice-focused debate about the future of our shorelines.


    Florian Steig receives funding from the German Academic Scholarship Foundation (Studienstiftung des deutschen Volkes).

    ref. What Danish climate migration drama, Families Like Ours, gets wrong about rising sea levels – https://theconversation.com/what-danish-climate-migration-drama-families-like-ours-gets-wrong-about-rising-sea-levels-259234

    MIL OSI – Global Reports

  • MIL-OSI Analysis: Could the first images from the Vera Rubin telescope change how we view space for good?

    Source: The Conversation – UK – By Professor Manda Banerji, Professor of Astrophysics, School of Physics & Astronomy, University of Southampton

    We are entering a new era of cosmic exploration. The new Vera C Rubin Observatory in Chile will transform astronomy with its extraordinary ability to map the universe in breathtaking detail. It is set to reveal secrets previously beyond our grasp. Here, we delve into the first images taken by Rubin’s telescope and what they are already showing us.

    These images vividly showcase the unprecedented power that Rubin will use to revolutionise astronomy and our understanding of the universe. Rubin is truly transformative, thanks to its unique combination of sensitivity, vast sky area coverage and exceptional image quality.

    These pictures powerfully demonstrate those attributes. They reveal not only bright objects in exquisite detail but also faint structures, both near and far, across a large area of sky.

    Cosmic nurseries – nebulae in detail

    The stunning pink and blue clouds in this image are the Lagoon (lower left) and Trifid (upper right) nebulae. The word nebula comes from the Latin for cloud, and these giant clouds are truly enormous – so vast it takes light decades to travel across them. They are stellar nurseries, the very birth sites for the next generation of stars and planets in our Milky Way galaxy.

    The intense radiation from hot, young stars energises the gas particles, causing them to glow pink. Further from these nascent stars, colder regions consist of microscopic dust grains. These reflect starlight (a process known in astronomy as “scattering”), much like our atmosphere, creating the beautiful blue hues. Darker filaments within are much denser regions of dust, obscuring all but the brightest background stars.

    To detect these colours, astronomers use filters over their instruments, allowing only certain wavelengths of light onto the detectors. Rubin has six such filters, spanning from short ultraviolet (UV) wavelengths through the visible spectrum to longer near-infrared light. Combining information from these different filters enables detailed measurements of the properties of stars and gas, such as their temperature and size.

    Rubin’s speed – its ability to take an image with one filter and then quickly move to the next – combined with the sheer area of sky it can see at any one time, is what makes it so unique and so exciting. The level of detail, revealing the finest and faintest structures, will enable it to map the substructure and satellite galaxies of the Milky Way like never before.

    Mapping galaxies across billions of light years

    This image captures a small section of NSF–DOE Vera C. Rubin Observatory’s view of the Virgo Cluster, offering a vivid glimpse of the variety in the cosmos.
    Credit: NSF–DOE Vera C. Rubin Observatory

    The images of galaxies powerfully demonstrate the scale at which the Rubin
    observatory will map the universe beyond our own Milky Way. The large galaxies
visible here (such as the two bright spiral-shaped galaxies in the lower right quarter of the picture) belong to the Virgo Cluster, a giant structure containing more than 1,000 galaxies, each holding billions to trillions of stars.

    This image beautifully showcases the huge diversity of shapes, sizes and colours of galaxies in our universe revealed by Rubin in their full technicolour glory. Inside these galaxies, bright dots are visible – these are star-forming regions, just like the Lagoon and Trifid nebulae, but remarkably, these are millions of light years away from us.

The still image captures just 2% of the area of a full Rubin image, revealing a universe that is teeming with celestial bodies. The full image, which contains around ten million galaxies, would need several hundred ultra-high-definition TV screens to display in all its detail. By the end of its ten-year survey, Rubin will catalogue the properties of some 20 billion galaxies, recording their colours and locations on the sky. That catalogue will contain information about even more mysterious components of our universe, such as dark matter and dark energy. Dark matter makes up most of the matter in the cosmos but does not reflect or emit light. Dark energy seems to be responsible for the accelerating expansion of the universe.
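The "several hundred TV screens" figure can be sanity-checked with back-of-the-envelope arithmetic. This is only a sketch: Rubin's LSST Camera produces roughly 3.2-gigapixel images, and the 4K UHD screen resolution of 3840 × 2160 pixels is a standard assumption, not a figure from the article itself.

```python
# Rough estimate: how many 4K UHD screens would be needed to show one
# full-resolution Rubin image at native (1:1) pixel scale.
rubin_pixels = 3.2e9       # LSST Camera: ~3.2 gigapixels per image
uhd_pixels = 3840 * 2160   # one 4K UHD television: ~8.3 megapixels

screens = rubin_pixels / uhd_pixels
print(round(screens))      # ~386, consistent with "several hundred"
```

The estimate ignores bezels and overlap, but it confirms the order of magnitude quoted in the article.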

    The UK’s role

    These unfathomable numbers demand data processing on a whole new scale.
    Uncovering new discoveries from this data requires a giant collaborative effort, in which UK astronomy is playing a major role. The UK will process around 1.5 million Rubin images and hosts one of three international data access centres for the project, providing scientists across the globe with access to the vast Rubin data. Here at the University of Southampton, we are leading two critical software
    development contributions to Rubin.

The first of these is the capability to combine Rubin images with those at longer infrared wavelengths. This extends the colours that Rubin sees, providing key diagnostic information about the properties of stars and galaxies. The second is the software that will link Rubin observations to another new instrument called 4MOST, soon to be installed on the VISTA telescope in Chile.

Part of 4MOST’s job will be to snap up and classify rapidly changing “sources”, or objects, in the sky that have been discovered by Rubin. One such type of rapidly changing source is a stellar explosion known as a supernova. We expect that within just two years Rubin will have catalogued more supernova explosions than have ever been recorded before. Our contributions to the Rubin project will therefore lead to a totally new understanding of how the stars and galaxies in our universe live and die, offering an unprecedented glimpse into the grand cosmic cycle.

    The Rubin observatory isn’t just a new telescope – it’s a new pair of eyes on the
universe, revealing the cosmos in unprecedented detail. A treasure trove of discoveries awaits, but most interesting among them will be the hidden secrets of the universe that we are yet to contemplate. The first images from Rubin have been a spectacular demonstration of the vastness of the universe. What might we find in
    this gargantuan dataset of the cosmos as the ultimate timelapse movie of our
    universe unfolds?

    Professor Manda Banerji receives funding from the Royal Society and the Science and Technology Facilities Council.

    Dr Philip Wiseman receives funding from the Science and Technology Facilities Council

    ref. Could the first images from the Vera Rubin telescope change how we view space for good? – https://theconversation.com/could-the-first-images-from-the-vera-rubin-telescope-change-how-we-view-space-for-good-259857

    MIL OSI Analysis

  • MIL-OSI USA: Podcast: Angst Over Additives in Our Food

    Source: US State of Connecticut

    The “UConn Health Pulse” podcast brings a variety of expertise on health topics to the general public.

    In light of efforts at the federal level to restrict certain food dyes, it can be challenging to separate fact from myth when it comes to things in our food that aren’t naturally occurring.

Dr. Rebecca Andrews, a UConn Health primary care physician whose roles include director of primary care, associate program director of UConn’s Internal Medicine Residency and, nationally, chair of the American College of Physicians Board of Regents, has been following the science. She joins the “UConn Health Pulse” podcast to help differentiate the potential risks of food additives from the benefits of natural, whole foods, explain why we may be paying more attention to this now, and suggest how to navigate the noise and make good choices.

If you make foods more attractive, more tasty, and they make us feel good, we can develop almost food addictions or unhealthy eating. – Dr. Rebecca Andrews

    MIL OSI USA News

  • MIL-OSI Analysis: Thimerosal discouraged in US flu vaccines, breaking with WHO guidance

    Source: The Conversation – UK – By Edward Beamer, Lecturer, Pharmacology, Sheffield Hallam University

    A federal vaccine panel, recently reshaped by US health secretary Robert F. Kennedy Jr., has voted to discourage the use of flu vaccines containing thimerosal, a mercury-based preservative. The decision marks a dramatic shift in vaccine policy, as thimerosal has long been considered safe by health agencies worldwide, with its use already limited to a few multi-dose flu shots.

    RFK Jr. has long linked thimerosal to autism – a connection that extensive scientific research has thoroughly debunked.

Thimerosal is an organic chemical containing mercury, used as a preservative in vaccines since the 1930s. Its effect comes from the mercury, which disrupts the function of enzymes in microbes such as bacteria and fungi. This prevents contamination of vaccines while they are stored in vials. Mercury, however, is also well known as a potent toxin acting on cells in the brain.

    Much of mercury’s toxicity to brain cells stems from the same attributes that make thimerosal such a useful preservative. It disrupts the basic biological function of cells by changing the structure of proteins and enzymes.

In the brain, this can cause neurons to become excessively active, impair the way they use energy, increase inflammation and lead to the death of neurons. While mercury poisoning can damage brain function in adults, babies are even more vulnerable.

    People have long understood that mercury is toxic. But in the latter half of the 20th century, scientists discovered that industrial mercury entered rivers and seas, accumulating in the tissues of fish and shellfish. The neurological consequences of consuming too much contaminated seafood could be severe. This led environmental scientists to determine safe levels of mercury exposure.

    Anxiety about mercury in vaccines intensified when it was noticed that some children receiving multiple vaccines could exceed established safety limits for mercury exposure. These limits were based on environmental toxicity studies. How mercury affects the brain, though, depends very much on the chemical form in which it is ingested.

    In the 20th century, scientists discovered that mercury accumulates in the fish that we eat.
    J nel/Shutterstock.com

    Methylmercury v ethylmercury

    The form of mercury that contaminates the environment as a consequence of industrial processes is methylmercury. The form that is part of thimerosal is ethylmercury.

The structure of these molecules differs in subtle but important ways: ethylmercury has one more carbon atom and two more hydrogen atoms than methylmercury. These small differences significantly affect how each compound behaves in the body, particularly in how easily each dissolves in fats.

    Fat solubility is a key consideration in pharmacokinetics – the science of how drugs and other molecules travel through the body. Briefly, because cell membranes are made of fatty substances, a molecule’s ability to dissolve in fats strongly influences how it crosses these membranes and moves through the body.

    It affects how a molecule is absorbed into the blood, how it is distributed to different tissues, how it is broken down by the body into other chemicals and how it is excreted.

    Methylmercury from environmental contamination is more fat-soluble than ethylmercury from thimerosal. This means that it accumulates more easily in tissues, and is excreted from the body more slowly.

    It also means that it can more easily cross into the brain and accumulate at greater concentrations for longer. For this reason, the safety guidelines that were established for methylmercury were unlikely to accurately predict the safety of ethylmercury.

    Global policy shift amid public fear

    Nevertheless, concerns about vaccine hesitancy, rising autism diagnoses and fears of a potential link to childhood vaccines led to thimerosal being almost entirely removed from childhood vaccines in the US by 2001 and in the UK between 2003 and 2005.

    Beyond biological considerations, policymakers were also responding to concerns about how vaccine fears could undermine immunisation efforts and fuel the spread of infectious diseases.

Denmark, which removed thimerosal from childhood vaccines in 1992, provided an early opportunity to study the issue. Researchers compared rates of autism before and after thimerosal’s removal, as well as with rates in similar countries that were still using it. Several large studies demonstrated conclusively that thimerosal was not causing autism or neurodevelopmental harm.

    Despite the overwhelming evidence that thimerosal is safe, it is no longer widely used in childhood vaccines in high-income countries, replaced by preservative-free vaccines, which must be stored as a single dose per vial.

    Storing multiple doses of a vaccine in the same vial, however, is still an extremely useful approach in resource-limited settings, in pandemics and where diseases require rapid, large-scale vaccination campaigns – common with influenza.

    International health bodies, including the World Health Organization, continue to support thimerosal’s use. They emphasise that the benefits of immunisation far outweigh the theoretical risks from low-dose ethylmercury exposure.

    Edward Beamer does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    ref. Thimerosal discouraged in US flu vaccines, breaking with WHO guidance – https://theconversation.com/thimerosal-discouraged-in-us-flu-vaccines-breaking-with-who-guidance-259609

    MIL OSI Analysis

  • MIL-OSI Analysis: Hurricane Helene set up future disasters, from landslides to flooding – cascading hazards like these are upending risk models

Source: The Conversation – USA – By Brian J. Yanites, Associate Professor of Earth and Atmospheric Science and Professor of Surficial and Sedimentary Geology, Indiana University

    The Carter Lodge hangs precariously over the flood-scoured bank of the Broad River in Chimney Rock Village, N.C., on May 13, 2025, eight months after Hurricane Helene. AP Photo/Allen G. Breed

    Hurricane Helene lasted only a few days in September 2024, but it altered the landscape of the Southeastern U.S. in profound ways that will affect the hazards local residents face far into the future.

    Mudslides buried roads and reshaped river channels. Uprooted trees left soil on hillslopes exposed to the elements. Sediment that washed into rivers changed how water flows through the landscape, leaving some areas more prone to flooding and erosion.

    Helene was a powerful reminder that natural hazards don’t disappear when the skies clear – they evolve.

    These transformations are part of what scientists call cascading hazards. They occur when one natural event alters the landscape in ways that lead to future hazards. A landslide triggered by a storm might clog a river, leading to downstream flooding months or years later. A wildfire can alter the soil and vegetation, setting the stage for debris flows with the next rainstorm.

Satellite images before (top) and after Hurricane Helene (bottom) show how the storm altered the landscape near Pensacola, N.C., in the Blue Ridge Mountains.
    Google Earth, CC BY

    I study these disasters as a geomorphologist. In a new paper in the journal Science, I and a team of scientists from 18 universities and the U.S. Geological Survey explain why hazard models – used to help communities prepare for disasters – can’t just rely on the past. Instead, they need to be nimble enough to forecast how hazards evolve in real time.

    The science behind cascading hazards

    Cascading hazards aren’t random. They emerge from physical processes that operate continuously across the landscape – sediment movement, weathering, erosion. Together, the atmosphere, biosphere and the earth are constantly reshaping the conditions that cause natural disasters.

    For instance, earthquakes fracture rock and shake loose soil. Even if landslides don’t occur during the quake itself, the ground may be weakened, leaving it primed for failure during later rainstorms.

    That’s exactly what happened after the 2008 earthquake in Sichuan Province, China, which led to a surge in debris flows long after the initial seismic event.

    A strong aftershock after a 7.8 magnitude earthquake in Sichuan province, China, in May 2008 triggered more landslides in central China.
    AP Photo/Andy Wong

    Earth’s surface retains a “memory” of these events. Sediment disturbed in an earthquake, wildfire or severe storm will move downslope over years or even decades, reshaping the landscape as it goes.

    The 1950 Assam earthquake in India is a striking example: It triggered thousands of landslides. The sediment from these landslides gradually moved through the river system, eventually causing flooding and changing river channels in Bangladesh some 20 years later.

    An intensifying threat in a changing world

    These risks present challenges for everything from emergency planning to home insurance. After repeated wildfire-mudslide combinations in California, some insurers pulled out of the state entirely, citing mounting risks and rising costs among the reasons.

    Cascading hazards are not new, but their impact is intensifying.

    Climate change is increasing the frequency and severity of wildfires, storms and extreme rainfall. At the same time, urban development continues to expand into steep, hazard-prone terrain, exposing more people and infrastructure to evolving risks.

    The rising risk of interconnected climate disasters like these is overwhelming systems built for isolated events.

    Yet climate change is only part of the equation. Earth processes – such as earthquakes and volcanic eruptions – also trigger cascading hazards, often with long-lasting effects.

    Mount St. Helens is a powerful example: More than four decades after its eruption in 1980, the U.S. Army Corps of Engineers continues to manage ash and sediment from the eruption to keep it from filling river channels in ways that could increase the flood risk in downstream communities.

    Rethinking risk and building resilience

    Traditionally, insurance companies and disaster managers have estimated hazard risk by looking at past events.

    But when the landscape has changed, the past may no longer be a reliable guide to the future. To address this, computer models based on the physics of how these events work are needed to help forecast hazard evolution in real time, much like weather models update with new atmospheric data.

    A March 2024 landslide in the Oregon Coast Range wiped out trees in its path.
    Brian Yanites, June 2025
    A drone image of the same March 2024 landslide in the Oregon Coast Range shows where it temporarily dammed the river below.
    Brian Yanites, June 2025

Thanks to advances in Earth observation technology such as satellite imagery, drones and lidar (which is similar to radar but uses light), scientists can now track how hillslopes, rivers and vegetation change after disasters. These observations can feed into geomorphic models that simulate how loosened sediment moves and where hazards are likely to emerge next.

    Researchers are already coupling weather forecasts with post-wildfire debris flow models. Other models simulate how sediment pulses travel through river networks.

    Cascading hazards reveal that Earth’s surface is not a passive backdrop, but an active, evolving system. Each event reshapes the stage for the next.

    Understanding these connections is critical for building resilience so communities can withstand future storms, earthquakes and the problems created by debris flows. Better forecasts can inform building codes, guide infrastructure design and improve how risk is priced and managed. They can help communities anticipate long-term threats and adapt before the next disaster strikes.

    Most importantly, they challenge everyone to think beyond the immediate aftermath of a disaster – and to recognize the slow, quiet transformations that build toward the next.

    Brian J. Yanites receives funding from the National Science Foundation.

    ref. Hurricane Helene set up future disasters, from landslides to flooding – cascading hazards like these are upending risk models – https://theconversation.com/hurricane-helene-set-up-future-disasters-from-landslides-to-flooding-cascading-hazards-like-these-are-upending-risk-models-259502

    MIL OSI Analysis

  • MIL-OSI Analysis: Checking in on New England’s fishing industry 25 Years after ‘The Perfect Storm’ hit movie theaters

    Source: The Conversation – USA – By Stephanie Otts, Director of National Sea Grant Law Center, University of Mississippi

    Filming ‘The Perfect Storm’ in Gloucester Harbor, Mass.
    The Salem News Historic Photograph Collection, Salem State University Archives and Special Collections, CC BY

    Twenty-five years ago, “The Perfect Storm” roared into movie theaters. The disaster flick, starring George Clooney and Mark Wahlberg, was a riveting, fictionalized account of commercial swordfishing in New England and a crew who went down in a violent storm.

    The anniversary of the film’s release, on June 30, 2000, provides an opportunity to reflect on the real-life changes to New England’s commercial fishing industry.

    Fishing was once more open to all

    In the true story behind the movie, six men lost their lives in late October 1991 when the commercial swordfishing vessel Andrea Gail disappeared in a fierce storm in the North Atlantic as it was headed home to Gloucester, Massachusetts.

    At the time, and until very recently, almost all commercial fisheries were open access, meaning there were no restrictions on who could fish.

    There were permit requirements and regulations about where, when and how you could fish, but anyone with the means to purchase a boat and associated permits, gear, bait and fuel could enter the fishery. Eight regional councils established under a 1976 federal law to manage fisheries around the U.S. determined how many fish could be harvested prior to the start of each fishing season.

    Fishing has been an integral part of coastal New England culture since its towns were established. In this 1899 photo, a New England community weighs and packs mackerel.
    Charles Stevenson/Freshwater and Marine Image Bank

    Fishing started when the season opened and continued until the catch limit was reached. In some fisheries, this resulted in a “race to the fish” or a “derby,” where vessels competed aggressively to harvest the available catch in short amounts of time. The limit could be reached in a single day, as happened in the Pacific halibut fishery in the late 1980s.

    By the 1990s, however, open access systems were coming under increased criticism from economists as concerns about overfishing rose.

    The fish catch peaked in New England in 1987 and would remain far above what the fish population could sustain for two more decades. Years of overfishing led to the collapse of fish stocks, including North Atlantic cod in 1992 and Pacific sardine in 2015.

    As populations declined, managers responded by cutting catch limits to allow more fish to survive and reproduce. Fishing seasons were shortened, as it took less time for the fleets to harvest the allowed catch. It became increasingly hard for fishermen to catch enough fish to earn a living.

    Saving fisheries changed the industry

    In the early 2000s, as these economic and environmental challenges grew, fisheries managers started limiting access. Instead of allowing anyone to fish, only vessels or individuals meeting certain eligibility requirements would have the right to fish.

    The most common method of limiting access in the U.S. is through limited entry permits, initially awarded to individuals or vessels based on previous participation or success in the fishery. Another approach is to assign individual harvest quotas or “catch shares” to permit holders, limiting how much each boat can bring in.

    In 2007, Congress amended the 1976 Magnuson-Stevens Fishery Conservation and Management Act to promote the use of limited access programs in U.S. fisheries.

    Ships in the fleet out of New Bedford, Mass.
    Henry Zbyszynski/Flickr, CC BY

    Today, limited access is common, and there are positive signs that the management change is helping achieve the law’s environmental goal of preventing overfishing. Since 2000, the populations of 50 major fishing stocks have been rebuilt, meaning they have recovered to a level that can once again support fishing.

    I’ve been following the changes as a lawyer focused on ocean and coastal issues, and I see much work still to be done.

    Forty fish stocks are currently being managed under rebuilding plans that limit catch to allow the stock to grow, including Atlantic cod, which has struggled to recover due to a complex combination of factors, including climatic changes.

    The lingering effect on communities today

    While many fish stocks have recovered, the effort came at an economic cost to many individual fishermen. The limited-access Northeast groundfish fishery, which includes Atlantic cod, haddock and flounder, shed nearly 800 crew positions between 2007 and 2015.

    The loss of jobs and revenue from fishing impacts individual family income and relationships, strains other businesses in fishing communities, and affects those communities’ overall identity and resilience, as illustrated by a recent economic snapshot of the Alaska seafood industry.

    When original limited-access permit holders leave the business – for economic, personal or other reasons – their permits are either terminated or sold to other eligible permit holders, leading to fewer active vessels in the fleet. As a result, the number of vessels fishing for groundfish has declined from 719 in 2007 to 194 in 2023, meaning fewer jobs.

    A fisherman unloads a portion of his catch for the day of 300 pounds of groundfish, including flounder, in January 2006 in Gloucester, Mass.
    AP Photo/Lisa Poole

Because of their scarcity, limited-access permits can cost upward of US$500,000, which is often beyond the financial means of a small business or a young person seeking to enter the industry. The high prices may also lead retiring fishermen to sell their permits, as opposed to passing them along with the vessels to the next generation.

    These economic forces have significantly altered the fishing industry, leading to more corporate and investor ownership, rather than the family-owned operations that were more common in the Andrea Gail’s time.

    Similar to the experience of small family farms, fishing captains and crews are being pushed into corporate arrangements that reduce their autonomy and revenues.

Consolidation can threaten the future of entire fleets, as New Bedford, Massachusetts, saw when Blue Harvest Fisheries, backed by a private equity firm, bought up vessels and other assets and then declared bankruptcy a few years later, leaving a smaller fleet and some local businesses and fishermen unpaid for their work. A company with local connections bought eight vessels from Blue Harvest along with 48 state and federal permits the company held.

    New challenges and unchanging risks

    While there are signs of recovery for New England’s fisheries, challenges continue.

    Warming water temperatures have shifted the distribution of some species, affecting where and when fish are harvested. For example, lobsters have moved north toward Canada. When vessels need to travel farther to find fish, that increases fuel and supply costs and time away from home.

    Fisheries managers will need to continue to adapt to keep New England’s fisheries healthy and productive.

    One thing that, unfortunately, hasn’t changed is the dangerous nature of the occupation. Between 2000 and 2019, 414 fishermen died in 245 disasters.

    Stephanie Otts receives funding from the NOAA National Sea Grant College Program through the U.S. Department of Commerce. Previous support for fisheries management legal research provided by The Nature Conservancy.

    ref. Checking in on New England’s fishing industry 25 Years after ‘The Perfect Storm’ hit movie theaters – https://theconversation.com/checking-in-on-new-englands-fishing-industry-25-years-after-the-perfect-storm-hit-movie-theaters-255076

    MIL OSI Analysis

  • MIL-OSI Russia: At SPbGASU, schoolchildren were presented with certificates of their first profession

    Translation. Region: Russian Federal

    Source: Saint Petersburg State University of Architecture and Civil Engineering – Saint Petersburg State University of Architecture and Civil Engineering – Congratulations to the graduate

    On June 24, a ceremony was held in the meeting room of the Academic Council of SPbGASU to present ninth-grade graduates of School No. 334 in the Nevsky District of St. Petersburg with certificates of their first profession and certificates of completion of training at SPbGASU.

    Andrey Zazykin, dean of the automobile and road engineering faculty of SPbGASU, welcomed the graduates and conveyed congratulations from Rector Evgeny Rybnov. He noted that the students have many professional achievements and personal victories ahead of them, and that the experience they have gained will help them make the right choice of future profession.

    Sergey Maevsky, Advisor to the General Director of the St. Petersburg State Unitary Enterprise “Passazhiravtotrans”, the largest passenger transport operator in the North-West region, invited the students to join the enterprise in the future.

    Principal of School No. 334 Natalia Nagaichenko addressed the graduates with a farewell speech: “Remember the names of those who made our country famous with their discoveries and inventions, whose works became the foundation for the development of science and technology. They left us a rich heritage, which we can rightfully be proud of! But pride in the past is only a starting point. True strength is in the desire to surpass what has been achieved, in the desire to make a contribution to the future. Believe in yourself, in your potential, in the power of Russian science and engineering! Go forward, to new heights, for the benefit of the city and the country!”

    In autumn 2023, SPbGASU, in cooperation with the Center for Advanced Professional Training, SPb GBPOU “Academy of Transport Technologies” and key partners – Renga Softvea LLC, St. Petersburg State Unitary Enterprise “Passazhiravtotrans” and EVROAVTO LLC – began implementing the Engineering Classes project.

    At School No. 334, a TIM class and the first motor transport class in the Northern capital were created. The main goal of the engineering classes is early career guidance and the training of engineering personnel, as well as building a chain of sustainable interaction: school – college – university – employer.

    Over the course of two years, schoolchildren mastered additional general development programs and a vocational training program, for which they successfully passed the final certification.

    In a ceremonial atmosphere, graduates were presented with certificates of completion of training in additional general development programs of SPbGASU and certificates of the first professions of “Automobile Repair Mechanic”, “Draftsman-Designer” and “Inspector of Purchased Components”. The certificates were presented by Anna Samodelkina, Senior Methodologist of the Center for Advanced Professional Training of St. Petersburg; Natalia Khlebova, Head of the Career Guidance and Employment Department of the Academy of Transport Technologies; Igor Chernyaev, Head of the Department of Technical Operation of Vehicles at SPbGASU; Roman Litvin, Associate Professor of the Department of Ground Transport and Technological Machines at SPbGASU; and Ekaterina Kopylova, Deputy Director of the Institute of Continuing Education at SPbGASU.

    Please note: This information is raw content directly from the source of the information. It is exactly what the source states and does not reflect the position of MIL-OSI or its clients.

    MIL OSI Russia News

  • MIL-OSI Analysis: Natural disasters don’t disappear when the storm ends or the earthquake stops – they evolve

    Source: The Conversation – USA – By Brian J. Yanites, Associate Professor of Earth and Atmospheric Science; Professor of Surficial and Sedimentary Geology, Indiana University

    The Carter Lodge hangs precariously over the flood-scoured bank of the Broad River in Chimney Rock Village, N.C., on May 13, 2025, eight months after Hurricane Helene. AP Photo/Allen G. Breed

    Hurricane Helene lasted only a few days in September 2024, but it altered the landscape of the Southeastern U.S. in profound ways that will affect the hazards local residents face far into the future.

    Mudslides buried roads and reshaped river channels. Uprooted trees left soil on hillslopes exposed to the elements. Sediment that washed into rivers changed how water flows through the landscape, leaving some areas more prone to flooding and erosion.

    Helene was a powerful reminder that natural hazards don’t disappear when the skies clear – they evolve.

    These transformations are part of what scientists call cascading hazards. They occur when one natural event alters the landscape in ways that lead to future hazards. A landslide triggered by a storm might clog a river, leading to downstream flooding months or years later. A wildfire can alter the soil and vegetation, setting the stage for debris flows with the next rainstorm.

    Satellite images before (top) and after Hurricane Helene (bottom) show how the storm altered landscape near Pensacola, N.C., in the Blue Ridge Mountains.
    Google Earth, CC BY

    I study these disasters as a geomorphologist. In a new paper in the journal Science, I and a team of scientists from 18 universities and the U.S. Geological Survey explain why hazard models – used to help communities prepare for disasters – can’t just rely on the past. Instead, they need to be nimble enough to forecast how hazards evolve in real time.

    The science behind cascading hazards

    Cascading hazards aren’t random. They emerge from physical processes that operate continuously across the landscape – sediment movement, weathering, erosion. Together, the atmosphere, biosphere and the earth are constantly reshaping the conditions that cause natural disasters.

    For instance, earthquakes fracture rock and shake loose soil. Even if landslides don’t occur during the quake itself, the ground may be weakened, leaving it primed for failure during later rainstorms.

    That’s exactly what happened after the 2008 earthquake in Sichuan Province, China, which led to a surge in debris flows long after the initial seismic event.

    A strong aftershock after a 7.8 magnitude earthquake in Sichuan province, China, in May 2008 triggered more landslides in central China.
    AP Photo/Andy Wong

    Earth’s surface retains a “memory” of these events. Sediment disturbed in an earthquake, wildfire or severe storm will move downslope over years or even decades, reshaping the landscape as it goes.

    The 1950 Assam earthquake in India is a striking example: It triggered thousands of landslides. The sediment from these landslides gradually moved through the river system, eventually causing flooding and changing river channels in Bangladesh some 20 years later.

    An intensifying threat in a changing world

    These risks present challenges for everything from emergency planning to home insurance. After repeated wildfire-mudslide combinations in California, some insurers pulled out of the state entirely, citing mounting risks and rising costs among the reasons.

    Cascading hazards are not new, but their impact is intensifying.

    Climate change is increasing the frequency and severity of wildfires, storms and extreme rainfall. At the same time, urban development continues to expand into steep, hazard-prone terrain, exposing more people and infrastructure to evolving risks.

    The rising risk of interconnected climate disasters like these is overwhelming systems built for isolated events.

    Yet climate change is only part of the equation. Earth processes – such as earthquakes and volcanic eruptions – also trigger cascading hazards, often with long-lasting effects.

    Mount St. Helens is a powerful example: More than four decades after its eruption in 1980, the U.S. Army Corps of Engineers continues to manage ash and sediment from the eruption to keep it from filling river channels in ways that could increase the flood risk in downstream communities.

    Rethinking risk and building resilience

    Traditionally, insurance companies and disaster managers have estimated hazard risk by looking at past events.

    But when the landscape has changed, the past may no longer be a reliable guide to the future. To address this, computer models based on the physics of how these events work are needed to help forecast hazard evolution in real time, much like weather models update with new atmospheric data.
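    The idea of updating a physics-based hazard model as new observations arrive, much as weather models assimilate atmospheric data, can be sketched with a simple scalar data-assimilation step. This is an illustrative example only, not from the paper; the quantity tracked (hillslope soil wetness) and all numbers are invented for demonstration.

    ```python
    # Minimal sketch of data assimilation: blend a model's hazard-relevant
    # estimate (here, hypothetical hillslope soil wetness) with incoming
    # sensor observations, weighting each by its uncertainty.
    def assimilate(estimate, est_var, observation, obs_var):
        """One Kalman-style scalar update; returns the new (mean, variance)."""
        gain = est_var / (est_var + obs_var)   # trust the observation more
        new_mean = estimate + gain * (observation - estimate)
        new_var = (1.0 - gain) * est_var       # uncertainty shrinks with data
        return new_mean, new_var

    # Usage: the model guesses wetness 0.4 with high uncertainty; three
    # precise sensor readings near 0.7 pull the estimate toward them.
    mean, var = 0.4, 0.09
    for obs in (0.70, 0.68, 0.72):
        mean, var = assimilate(mean, var, obs, obs_var=0.01)
    ```

    Each update moves the estimate toward the observations and reduces its variance, which is the basic mechanism that lets a forecast model stay current as conditions on the ground change.
    
    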

    A March 2024 landslide in the Oregon Coast Range wiped out trees in its path.
    Brian Yanites, June 2025
    A drone image of the same March 2024 landslide in the Oregon Coast Range shows where it temporarily dammed the river below.
    Brian Yanites, June 2025

    Thanks to advances in Earth observation technology, including satellite imagery, drones and lidar (which is similar to radar but uses light), scientists can now track how hillslopes, rivers and vegetation change after disasters. These observations can feed into geomorphic models that simulate how loosened sediment moves and where hazards are likely to emerge next.

    Researchers are already coupling weather forecasts with post-wildfire debris flow models. Other models simulate how sediment pulses travel through river networks.
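    The sediment-pulse idea can be sketched, purely for illustration, as a one-dimensional advection-diffusion model: a landslide deposit placed in one upstream cell of a channel migrates downstream and spreads out over time. The grid, velocity and diffusivity below are invented demonstration values, not parameters from any published model.

    ```python
    # Illustrative 1-D sediment-pulse routing: upwind advection plus
    # central diffusion of sediment depth along a river channel.
    import numpy as np

    def route_sediment_pulse(n_cells=200, n_steps=500, dx=100.0, dt=50.0,
                             velocity=0.5, diffusivity=50.0):
        """Explicit finite-difference step; returns sediment depth (m) per cell."""
        depth = np.zeros(n_cells)
        depth[10] = 2.0  # hypothetical landslide deposit in one upstream cell
        for _ in range(n_steps):
            # upwind advection: sediment carried downstream at `velocity`
            adv = -velocity * (depth - np.roll(depth, 1)) / dx
            # diffusion: the pulse flattens and lengthens as it travels
            dif = diffusivity * (np.roll(depth, 1) - 2.0 * depth
                                 + np.roll(depth, -1)) / dx**2
            depth = depth + dt * (adv + dif)
            depth[0] = 0.0  # no sediment supplied at the upstream boundary
        return depth

    depth = route_sediment_pulse()
    peak_cell = int(np.argmax(depth))
    ```

    The qualitative behavior matches the article's point: the hazard (here, where the sediment sits) is not fixed at the landslide site but moves and spreads through the river network over time, so the location of peak flood or aggradation risk shifts downstream long after the triggering event.
    
    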

    Cascading hazards reveal that Earth’s surface is not a passive backdrop, but an active, evolving system. Each event reshapes the stage for the next.

    Understanding these connections is critical for building resilience so communities can withstand future storms, earthquakes and the problems created by debris flows. Better forecasts can inform building codes, guide infrastructure design and improve how risk is priced and managed. They can help communities anticipate long-term threats and adapt before the next disaster strikes.

    Most importantly, they challenge everyone to think beyond the immediate aftermath of a disaster – and to recognize the slow, quiet transformations that build toward the next.

    Brian J. Yanites receives funding from the National Science Foundation.

    ref. Natural disasters don’t disappear when the storm ends or the earthquake stops – they evolve – https://theconversation.com/natural-disasters-dont-disappear-when-the-storm-ends-or-the-earthquake-stops-they-evolve-259502

    MIL OSI Analysis

  • MIL-OSI: LivFresh 2025: How LivFresh Dental Gel Toothpaste Is Earning Clinical and Consumer Trust in Oral Health

    Source: GlobeNewswire (MIL-OSI)

    Los Gatos, CA, June 27, 2025 (GLOBE NEWSWIRE) — In the rapidly evolving world of oral care, LivFresh stands out as a breakthrough that merges clinically tested results with a refreshing user experience. From dental chairs to bathroom cabinets, LivFresh Dental Gel Toothpaste is quickly becoming a trusted name, earning confidence from healthcare professionals and everyday users alike. Backed by university-led studies and growing word-of-mouth traction, LivFresh is redefining the standards of plaque control and gum support in 2025.

    The Science Behind LivFresh: What Makes This Dental Gel Different?

    At the heart of LivFresh Dental Gel is a patented formulation developed by researchers at Livionex that disrupts plaque before it can bond to teeth and gums. Unlike conventional toothpaste, which relies on physical abrasives and foaming agents, LivFresh works at the molecular level—targeting the electrostatic attraction that causes plaque to stick. This scientific approach allows the gel to break down plaque-forming proteins without irritating gums or enamel.

    What sets LivFresh apart is its clean formulation: no sodium lauryl sulfate, no triclosan, and no peroxide. Instead, the gel uses safe, non-toxic compounds that have been clinically shown to reduce plaque by over 250% compared to traditional toothpaste. These results aren’t marketing claims—they’re peer-reviewed and published in respected dental journals.

    LivFresh is not only about prevention. It enhances overall oral hygiene by creating a surface on the teeth that is less likely to attract bacteria. This provides longer-lasting cleanliness and a smoother mouthfeel. In 2025, as consumers become more ingredient-conscious, LivFresh emerges as a standout in the category—where science leads and simplicity follows.

    Dentist Approved: What Leading Oral Health Professionals Say About LivFresh

    According to the official website, dental professionals are increasingly recommending LivFresh to patients concerned about plaque buildup, gum inflammation, and post-cleaning sensitivity. The appeal lies in its evidence-backed performance and its gentler, detergent-free formula. As oral health becomes more integrated into overall wellness, dentists are favoring products that support long-term gum health without harsh ingredients.

    Dr. Andrea Peterson, DDS, a practicing periodontist in Seattle, remarks, “LivFresh is one of the few products I recommend daily. Its formulation shows real impact on oral biofilm and gingival inflammation. It’s more than a toothpaste—it’s a preventive tool.”

    LivFresh’s rising credibility within clinical circles stems from the way it shifts the paradigm of oral care. It doesn’t just clean; it alters the environment in which bacteria thrive. That distinction is key for dental professionals treating patients with periodontal risk factors or sensitive oral conditions.

    With more dental offices integrating LivFresh into their post-procedure care kits and hygienists noticing improved check-up results among regular users, this gel has firmly positioned itself as more than a trend—it’s a clinical ally. For practitioners focused on proactive care, LivFresh offers a research-driven alternative to traditional products.

    Visit the official website for more information.

    Consumer Trust Grows: Why LivFresh Toothpaste Is Gaining Loyal Users Nationwide

    LivFresh isn’t just winning over dentists—it’s gaining momentum with everyday users who demand more from their oral care. The feedback from consumers has been overwhelmingly positive, with many reporting fresher breath, smoother teeth, and visibly reduced plaque in just weeks of use.

    Online forums like Reddit and dedicated oral health groups have helped spread the word organically. On platforms such as TikTok and Instagram, users are sharing before-and-after videos showcasing visibly cleaner teeth and improved gum health. One frequent sentiment: “It feels like I just left the dentist—every time I brush.”

    LivFresh is especially popular among those with sensitive mouths, braces, or a history of gingivitis. Many users express relief that the gel doesn’t burn, foam aggressively, or contain irritating ingredients. Its gentle nature combined with clinical strength results has earned the brand a reputation for trustworthiness.

    Repeat buyers now form the foundation of LivFresh’s growth, with subscription orders steadily increasing through the brand’s official site. In an age where consumer loyalty is earned, not assumed, LivFresh stands out by delivering a better brushing experience with verifiable benefits.

    Clinical Trials and Safety Profile

    LivFresh’s scientific credibility stems from its rigorous clinical validation. In multiple randomized controlled trials conducted by major dental schools, LivFresh Dental Gel demonstrated up to 2.5x greater plaque reduction compared to conventional fluoride toothpaste. These results were measured through precise plaque scoring systems used in periodontal studies.

    Importantly, these studies also showed a marked reduction in gingival bleeding and inflammation, two major indicators of gum disease. Participants noted both subjective improvements—such as smoother teeth and less sensitivity—and measurable changes validated by dental professionals.

    Safety has also been a core focus. LivFresh contains no artificial colors, sulfates, preservatives, or harsh foaming agents. Its active ingredients are considered Generally Recognized As Safe (GRAS) by the FDA. The gel is also pH balanced to maintain the oral microbiome and reduce enamel erosion risk.

    LivFresh’s formulation has passed toxicological assessments and dermatological tests, making it suitable for daily use, even among individuals with sensitive gums or a history of periodontal treatment. In a crowded marketplace where many products lean heavily on marketing, LivFresh’s clinical pedigree makes it a rare standout—delivering both performance and peace of mind.

    How LivFresh Dental Gel Works

    According to the official website, the power of LivFresh Dental Gel lies in its ability to disrupt plaque at its earliest stage. Most toothpaste cleans reactively, scrubbing off existing buildup. LivFresh, on the other hand, prevents plaque from adhering to the teeth in the first place. It does this by neutralizing the electrostatic forces that allow proteins and bacteria to stick to enamel surfaces.

    This mechanism targets the formation of oral biofilm, a key contributor to gingivitis and tooth decay. By halting this process before it starts, LivFresh reduces the bacteria load in the mouth while maintaining a healthy oral pH. The result is a cleaner mouth that stays fresh longer after each brushing session.

    Unlike traditional pastes, LivFresh has a smooth gel consistency that coats the teeth more effectively, delivering consistent coverage and longer-lasting protection. It doesn’t foam unnecessarily, making it ideal for users with braces or implants.

    The brushing experience is noticeably different—more like a protective treatment than a quick rinse. And that’s the point: LivFresh is designed not just to clean teeth, but to create a cleaner oral environment altogether. It’s preventive science in a tube.

    LivFresh in the Media

    As LivFresh gains popularity, its presence across media channels continues to grow. From dental trade journals to mainstream outlets, LivFresh is being recognized for its scientific integrity and consumer-driven design. The brand has been featured in publications such as Dentistry Today, Oral Health & Prevention, and Modern Wellness Review, often spotlighted for its innovation in plaque control.

    Television segments and online health programs have also featured LivFresh, focusing on its appeal to users with gum sensitivity or post-procedure dental care. Influencers in the dental health space on TikTok and YouTube have praised the product, comparing it to professional cleanings—and showing real-time results.

    The brand has also made appearances in medical blogs, where it’s described as “one of the few oral care products that bridges the gap between clinical research and everyday use.” LivFresh’s scientific studies have been cited by professionals and discussed at dental symposia.

    Its rapid media traction is not the result of a massive advertising push—but rather, a ripple effect from scientific credibility and real user outcomes. In 2025, LivFresh is no longer a niche product—it’s a media-recognized player in the future of oral care.

    Visit the Official Website To Get More Information

    Daily Use, Simple Routine

    One of LivFresh’s biggest strengths lies in its simplicity. There’s no learning curve, no complicated dosing, and no prep time. Users simply brush twice daily with the gel—just as they would with any toothpaste. Yet the results are far beyond what traditional options offer.

    The smooth texture spreads easily across the enamel, reaching difficult areas without excessive foaming. It’s particularly useful for people with dental appliances, gum sensitivity, or those recovering from deep cleanings or procedures.

    The fresh, minty taste is clean without being overpowering, making it ideal for users of all ages. There’s no need for added rinses, special mouthwashes, or accompanying treatments. LivFresh integrates seamlessly into existing habits—whether you’re brushing in the morning rush or winding down at night.

    Users report a lasting clean feeling that extends hours past brushing. For people accustomed to brushing after every meal or coffee, LivFresh provides lasting freshness and less buildup throughout the day. In short, it delivers professional-grade results with everyday convenience. That’s a combination most oral care brands simply don’t offer.

    Eco-Conscious Innovation

    In a market increasingly driven by sustainability, LivFresh Dental Gel stands out not only for its science but also for its commitment to environmentally responsible practices. The brand has minimized the use of unnecessary packaging, opting for recyclable materials and reduced plastic where possible.

    The formula itself is free from harsh detergents, parabens, triclosan, and microbeads—ingredients commonly found in mainstream toothpaste that can harm aquatic ecosystems. LivFresh is also 100% cruelty-free, never tested on animals, and free from any animal-derived ingredients.

    In 2025, eco-conscious consumers are no longer satisfied with effectiveness alone. They want brands that align with their values. LivFresh has responded by building sustainability into its product and operations without compromising on clinical outcomes.

    From its low-impact manufacturing process to shipping practices aimed at reducing emissions, LivFresh is contributing to a cleaner mouth and a cleaner planet. For consumers balancing health with environmental responsibility, this dental gel offers both. It’s a step forward in oral care—without stepping backward on sustainability.

    Where to Buy LivFresh in 2025

    To ensure authenticity and optimal results, LivFresh recommends purchasing directly from its official website. This not only guarantees product integrity but also provides access to the brand’s subscription savings, trial kits, and periodic clinical updates. In 2025, online demand continues to rise, and LivFresh has scaled its logistics to offer fast, secure delivery across the U.S.

    While select dental offices may carry LivFresh, the company warns against purchasing from unauthorized third-party sellers on platforms like eBay or unofficial Amazon listings. Counterfeit and expired products can undermine the gel’s performance and safety profile.

    First-time buyers can often take advantage of bundled offers or risk-free guarantees on the official site, making it easy to try the product without commitment. LivFresh also offers customer support channels for brushing tips, subscription adjustments, and reordering reminders.

    In a category prone to overpromising and underdelivering, LivFresh prioritizes transparency, education, and safety from purchase to brushing. For those looking to experience clinically validated oral care from a trusted source, direct access remains the best and most reliable option.

    Closing Summary: Is LivFresh Worth Watching?

    As 2025 unfolds, LivFresh is proving that oral care can be both clinically advanced and consumer-friendly. With endorsements from dental professionals, strong results from published studies, and a growing fanbase of loyal users, LivFresh Dental Gel Toothpaste is no longer just an alternative—it’s a frontrunner.

    Its science-first formulation challenges the assumptions of what a toothpaste should do. By preventing plaque before it sticks and reducing inflammation without harsh ingredients, LivFresh brings real innovation to a space long dominated by outdated formulas.

    Consumers value its safety. Dentists value its efficacy. And the media is taking notice.

    For those seeking a smarter way to care for their teeth—without sacrificing simplicity or sustainability—LivFresh is more than just a dental gel. It’s a sign of where oral health is headed. And yes, it’s absolutely worth watching.

    For more information, educational content, and direct purchasing, visit the official LivFresh website.

    Company: LivFresh
    Email: info@getlivfresh.com
    Box 320928, Los Gatos, CA 95030,
    United States
    Website: https://www.healthysmiletoothpastepro.com/

    Disclaimer: The information provided in this article is for general educational and informational purposes only. It is not intended to serve as medical advice, diagnosis, or treatment. Always consult with a licensed healthcare provider before beginning any new oral care or supplement regimen, especially if you have a medical condition or are taking medication. Results with LivFresh may vary from person to person based on individual health factors, adherence to recommended usage, and lifestyle variables. The content herein is not written or reviewed by a licensed medical professional. 

    No responsibility is assumed for any errors, omissions, or inaccuracies in the content, nor any consequences arising from the use of the information contained in this article. The publisher and its affiliates do not endorse or guarantee any product mentioned herein. All trademarks, service marks, and brand names mentioned are the property of their respective owners.

    Attachment

    The MIL Network

  • MIL-OSI Analysis: What Trump’s budget proposal says about his environmental values

    Source: The Conversation – USA – By Stan Meiburg, Executive Director, Sabin Center for Environment and Sustainability, Wake Forest University

    The president’s spending proposal doesn’t leave much behind. Alexey Kravchuk/iStock / Getty Images Plus

    To understand the federal government’s true priorities, follow the money.

    After months of saying his administration is committed to clean air and water for Americans, President Donald Trump has proposed a detailed budget for the U.S. Environmental Protection Agency for fiscal year 2026. The proposal is more consistent with his administration’s numerous recent actions and announcements that reduce protection for public health and the environment.

    To us, former EPA leaders – one a longtime career employee and the other a political appointee – the budget proposal reveals a lot about what Trump and EPA Administrator Lee Zeldin want to accomplish.

    According to the administration’s Budget in Brief document, total EPA funding for the fiscal year beginning October 2025 would drop from US$9.14 billion to $4.16 billion – a 54% decrease from the budget enacted by Congress for fiscal 2025 and less than half of EPA’s budget in any year of the first Trump administration.

    Without taking inflation into account, this would be the smallest EPA budget since 1986. Adjusted for inflation, it would be the smallest budget since the Ford administration, even though Congress has for decades given EPA more responsibility to clean up and protect the nation’s air and water; handle hazardous chemicals and waste; protect drinking water; clean up environmental contamination; and evaluate the safety of a wide range of chemicals used in commerce and industry. These expansions reflected a bipartisan consensus that protecting public health and the environment is a national priority.
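    The percentage cut quoted above can be verified directly from the dollar figures the article attributes to the Budget in Brief. A minimal sanity check (amounts in billions, as quoted):

    ```python
    # Dollar amounts as quoted from the administration's Budget in Brief:
    # FY2025 enacted vs. FY2026 proposed EPA funding, in billions of dollars.
    enacted_fy2025 = 9.14
    proposed_fy2026 = 4.16

    # Fractional reduction relative to the enacted budget.
    cut = (enacted_fy2025 - proposed_fy2026) / enacted_fy2025
    print(f"Proposed EPA cut: {cut:.1%}")  # consistent with the quoted 54% decrease
    ```
    
    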

    The budget process in brief

    Federal budgeting is complicated, and EPA’s budget is particularly so. Here are some basics:

    Each year, the president and Congress determine how much money will be spent on what things, and by which agencies. The familiar aphorism that “the president proposes, Congress disposes” captures the Constitution’s process for the federal budget, with Congress firmly holding the “power of the purse.”

    EPA’s budget can be difficult to understand because individual programs may be funded from different sources. It is useful to consider it as a pie sliced into five main pieces:

    • Environmental programs and management: the day-to-day work of protecting air, water and land.
    • Science and technology: research on pollution, health effects and new environmental tools.
    • Superfund and trust funds: cleaning up contaminated sites and responding to emergency releases of pollution.
    • State and Tribal operating grants: supporting local implementation of environmental laws.
    • State capitalization grants: revolving loans for water infrastructure.

    The Trump administration’s budget proposals for EPA represent a striking retreat from the national goals of clean air and clean water enacted in federal laws over the past 55 years. In the budget document, the administration argues that the federal government has done enough and that the protection of gains already achieved, as well as any further progress, should not be paid for with federal money.

    This budget would reduce EPA’s ability to protect public health and the environment to a bare minimum at best. Most dramatic and, in our view, most significant are the elimination of operating grants to state governments, drastic reductions in funding for science of all kinds, and elimination of EPA programs relating to climate change and environmental justice, which addresses situations of disproportionate environmental harm to vulnerable populations. It would cut regulatory and enforcement activities that the administration sees as inconsistent with fossil energy development. Other proposed changes, notably for Superfund and capitalization grants, are more nuanced.

    These changes to EPA’s regular budget allocation are separate from changes to supplementary EPA funding that have also been in the news, including for projects specified in the Inflation Reduction Act and other specific laws.

    Environmental programs and management

    Funding for basic work to protect the environment and prevent pollution would be cut by 22%. The reductions are not spread equally, however. All activities related to climate change would be eliminated, including the Energy Star program and greenhouse gas reporting and tracking. Funding for civil and criminal enforcement of environmental laws and regulations would be cut by 69% and 50%, respectively.

    The popular Brownfields program would be cut by 50%. Since 1995, $2.9 billion in federal funds have produced public and private investments totaling $42 billion for cleaning and redeveloping contaminated sites, and created more than 200,000 jobs.

    A program to set standards and conduct training for safe removal of lead paint and other lead-containing materials from homes and businesses would be eliminated.

    The administration has been clear that EPA will no longer do environmental justice work, such as funding to monitor toxic air emissions in low-income neighborhoods adjacent to industrial areas. This budget is consistent with that.

    Science and technology

    Scientific support functions would be cut by 34%. The Office of Research and Development would shrink from about 1,500 staff to about 500, with the remaining staff redistributed throughout the agency. This would diminish science that supports not just EPA’s work but that of organizations, industries, health care professionals and public and private researchers who benefit from EPA’s research.

    A former uranium mill in Colorado is just one of the nation’s extremely contaminated Superfund sites awaiting federal money for cleanup.
    RJ Sangosti/MediaNews Group/The Denver Post via Getty Images

    Superfund and other trust funds

    Superfund is by far the largest of EPA’s cleanup trust funds. It allows EPA to clean up contaminated sites. It also forces the parties responsible for the contamination to either perform cleanups or reimburse the government for EPA-led cleanup work. When there is no viable responsible party, Superfund gives EPA the funds and authority to clean up contaminated sites.

    Prior to 2021, Superfund was funded through EPA’s annual budget. In 2021 and 2022, Congress restored taxes on selected chemicals and petroleum products to help pay for Superfund. During the Biden administration, EPA reduced the Superfund’s line in the general budget, with the expectation that the Superfund tax revenues would more than make up for the reduction. Administrator Zeldin, who has said that site cleanup is a priority, is proposing to shift virtually all funding for cleanups to these new tax revenues.

    There is risk in this approach, however. The Superfund tax expires in 2031 and has raised less than Treasury Department predictions in both 2023 and 2024. In fiscal year 2024, available tax receipts were predicted to be $2.5 billion, but only $1.4 billion was collected. Future funding is uncertain because it depends on the amounts of various chemicals that companies actually use. Experts disagree on whether this is significant for the Superfund program. The petrochemical industry, on whom this tax largely falls, is lobbying for its repeal.

    Funds to address leaks at gas station tanks would be cut nearly in half. Funds to clean up oil and petroleum spills would be cut by 24%.

    State operating grants

    The budget proposal seeks to reset the EPA’s relationship with state agencies, which implement the vast majority of environmental regulations.

    EPA has long delegated some of its powers to state environmental agencies, including permitting, inspections and enforcement of regulations that govern air, water and soil pollution. Since the 1970s, EPA has helped fund those activities through basic operating grants that require minimum state contributions and reward larger state investments with additional federal dollars.

    The proposed budget would eliminate all of those grants to states – totaling $1 billion. The document itself explains that federal funding over decades has totaled “hundreds of billions of dollars” and has resulted in programs that “are mature or have accomplished their purpose.”

    States disagree. They note that EPA has delegated 90% of the nation’s environmental protection work to state authorities, and states have accepted that workload based on the expectation of federal funding. The states say reduced funding would greatly diminish the actual work of environmental protection – site inspections, air and water monitoring, and enforcement – across the country.

    State capitalization grants

    Since 1987, EPA has given states money for revolving loan programs that provide low-interest loans to state and local governments to clean up waterways and provide safe drinking water. The proposed budget would cut that funding by 89%, from $2.8 billion to $305 million.

    These capitalization grants were originally envisioned as seed money, with future loans available as the initial and subsequent loans were repaid. But the need for water infrastructure continues to grow, and Congress has for many years allocated additional money to the program.

    In protecting the environment, you get what you pay for. In past years, Congress has refused to accept proposed drastic cuts to EPA’s budget. It remains to be seen whether this Congress will go along with these proposed rollbacks.

    Stan Meiburg is a volunteer with the Environmental Protection Network. He was an employee of the Environmental Protection Agency from 1977 to 2017.

    I have worked at the US EPA twice. During the Obama Administration, I was first principal deputy to the Assistant Administrator of the Office of Air and Radiation and then Acting Assistant Administrator. During the Biden Administration, I was Deputy Administrator. I am also a volunteer with the Environmental Protection Network.

    ref. What Trump’s budget proposal says about his environmental values – https://theconversation.com/what-trumps-budget-proposal-says-about-his-environmental-values-258962

    MIL OSI Analysis

  • MIL-OSI Analysis: Cyberattacks shake voters’ trust in elections, regardless of party

    Source: The Conversation – USA – By Ryan Shandler, Professor of Cybersecurity and International Relations, Georgia Institute of Technology

    An election worker installs a touchscreen voting machine. Ethan Miller/Getty Images

    American democracy runs on trust, and that trust is cracking.

    Nearly half of Americans, both Democrats and Republicans, question whether elections are conducted fairly. Some voters accept election results only when their side wins. The problem isn’t just political polarization – it’s a creeping erosion of trust in the machinery of democracy itself.

    Commentators blame ideological tribalism, misinformation campaigns and partisan echo chambers for this crisis of trust. But these explanations miss a critical piece of the puzzle: a growing unease with the digital infrastructure that now underpins nearly every aspect of how Americans vote.

    The digital transformation of American elections has been swift and sweeping. Just two decades ago, most people voted using mechanical levers or punch cards. Today, over 95% of ballots are counted electronically. Digital systems have replaced poll books, taken over voter identity verification processes and are integrated into registration, counting, auditing and voting systems.

    This technological leap has made voting more accessible and efficient, and sometimes more secure. But these new systems are also more complex. And that complexity plays into the hands of those looking to undermine democracy.

    In recent years, authoritarian regimes have refined a chillingly effective strategy to chip away at Americans’ faith in democracy by relentlessly sowing doubt about the tools U.S. states use to conduct elections. It’s a sustained campaign to fracture civic faith and make Americans believe that democracy is rigged, especially when their side loses.

    This is not cyberwar in the traditional sense. There’s no evidence that anyone has managed to break into voting machines and alter votes. But cyberattacks on election systems don’t need to succeed to have an effect. Even a single failed intrusion, magnified by sensational headlines and political echo chambers, is enough to shake public trust. By feeding into existing anxiety about the complexity and opacity of digital systems, adversaries create fertile ground for disinformation and conspiracy theories.

    Just before the 2024 presidential election, Director of the Cybersecurity and Infrastructure Security Agency Jen Easterly explains how foreign influence campaigns erode trust in U.S. elections.

    Testing cyber fears

    To test this dynamic, we launched a study to uncover precisely how cyberattacks corroded trust in the vote during the 2024 U.S. presidential race. We surveyed more than 3,000 voters before and after election day, showing them a series of fictional but highly realistic breaking news reports depicting cyberattacks against critical infrastructure. We randomly assigned participants to watch different types of news reports: some depicting cyberattacks on election systems, others on unrelated infrastructure such as the power grid, and a third, neutral control group.
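    The random-assignment step described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers’ actual code; the condition names and seed are assumptions for the example:

    ```python
    import random

    # Hypothetical condition labels for the three groups described in the study:
    # election-system cyberattack reports, unrelated-infrastructure reports, control.
    CONDITIONS = ["election_cyberattack", "infrastructure_cyberattack", "control"]

    def assign_conditions(n_respondents: int, seed: int = 42) -> dict:
        """Randomly place each respondent into one of the three conditions."""
        rng = random.Random(seed)  # fixed seed so the assignment is reproducible
        groups = {c: [] for c in CONDITIONS}
        for respondent_id in range(n_respondents):
            groups[rng.choice(CONDITIONS)].append(respondent_id)
        return groups

    groups = assign_conditions(3000)
    print({c: len(ids) for c, ids in groups.items()})
    ```

    With roughly 3,000 respondents split at random, each condition receives about a thousand participants, which is what lets the researchers attribute differences in post-election trust to the reports shown rather than to who was surveyed.
    
    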

    The results, which are under peer review, were both striking and sobering. Mere exposure to reports of cyberattacks undermined trust in the electoral process – regardless of partisanship. Voters who supported the losing candidate experienced the greatest drop in trust, with two-thirds of Democratic voters showing heightened skepticism toward the election results.

    But winners too showed diminished confidence. Even though most Republican voters, buoyed by their victory, accepted the overall security of the election, the majority of those who viewed news reports about cyberattacks remained suspicious.

    The attacks didn’t even have to be related to the election. Even cyberattacks against critical infrastructure such as utilities had spillover effects. Voters seemed to extrapolate: “If the power grid can be hacked, why should I believe that voting machines are secure?”

    Strikingly, voters who used digital machines to cast their ballots were the most rattled. For this group of people, belief in the accuracy of the vote count fell by nearly twice as much as that of voters who cast their ballots by mail and who didn’t use any technology. Their firsthand experience with the sorts of systems being portrayed as vulnerable personalized the threat.

    It’s not hard to see why. When you’ve just used a touchscreen to vote, and then you see a news report about a digital system being breached, the leap in logic isn’t far.

    Our data suggests that in a digital society, perceptions of trust – and distrust – are fluid, contagious and easily activated. The cyber domain isn’t just about networks and code. It’s also about emotions: fear, vulnerability and uncertainty.

    Firewall of trust

    Does this mean we should scrap electronic voting machines? Not necessarily.

    Every election system, digital or analog, has flaws. And in many respects, today’s high-tech systems have solved the problems of the past with voter-verifiable paper ballots. Modern voting machines reduce human error, increase accessibility and speed up the vote count. No one misses the hanging chads of 2000.

    But technology, no matter how advanced, cannot instill legitimacy on its own. It must be paired with something harder to code: public trust. In an environment where foreign adversaries amplify every flaw, cyberattacks can trigger spirals of suspicion. It is no longer enough for elections to be secure; voters must also perceive them to be secure.

    That’s why public education surrounding elections is now as vital to election security as firewalls and encrypted networks. It’s vital that voters understand how elections are run, how they’re protected and how failures are caught and corrected. Election officials, civil society groups and researchers can teach how audits work, host open-source verification demonstrations and ensure that high-tech electoral processes are comprehensible to voters.

    We believe this is an essential investment in democratic resilience. But it needs to be proactive, not reactive. By the time the doubt takes hold, it’s already too late.

    Just as crucially, we are convinced that it’s time to rethink the very nature of cyber threats. People often imagine them in military terms. But that framework misses the true power of these threats. The danger of cyberattacks is not only that they can destroy infrastructure or steal classified secrets, but that they chip away at societal cohesion, sow anxiety and fray citizens’ confidence in democratic institutions. These attacks erode the very idea of truth itself by making people doubt that anything can be trusted.

    If trust is the target, then we believe that elected officials should start to treat trust as a national asset: something to be built, renewed and defended. Because in the end, elections aren’t just about votes being counted – they’re about people believing that those votes count.

    And in that belief lies the true firewall of democracy.

    Anthony DeMattee receives funding from National Science Foundation and various academic institutions. He is the Data Scientist in the Democracy Program at The Carter Center.

    Bruce Schneier and Ryan Shandler do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

    ref. Cyberattacks shake voters’ trust in elections, regardless of party – https://theconversation.com/cyberattacks-shake-voters-trust-in-elections-regardless-of-party-259368

    MIL OSI Analysis

  • MIL-OSI Analysis: Cascading disasters like those created by Hurricane Helene show why hazard models can’t rely on the past

    Source: The Conversation – USA – By Brian J. Yanites, Associate Professor of Earth and Atmospheric Science. Professor of Surficial and Sedimentary Geology, Indiana University

    The Carter Lodge hangs precariously over the flood-scoured bank of the Broad River in Chimney Rock Village, N.C., on May 13, 2025, eight months after Hurricane Helene. AP Photo/Allen G. Breed

    Hurricane Helene lasted only a few days in September 2024, but it altered the landscape of the Southeastern U.S. in profound ways that will affect the hazards local residents face far into the future.

    Mudslides buried roads and reshaped river channels. Uprooted trees left soil on hillslopes exposed to the elements. Sediment that washed into rivers changed how water flows through the landscape, leaving some areas more prone to flooding and erosion.

    Helene was a powerful reminder that natural hazards don’t disappear when the skies clear – they evolve.

    These transformations are part of what scientists call cascading hazards. They occur when one natural event alters the landscape in ways that lead to future hazards. A landslide triggered by a storm might clog a river, leading to downstream flooding months or years later. A wildfire can alter the soil and vegetation, setting the stage for debris flows with the next rainstorm.

    Satellite images before (top) and after Hurricane Helene (bottom) show how the storm altered landscape near Pensacola, N.C., in the Blue Ridge Mountains.
    Google Earth, CC BY

    I study these disasters as a geomorphologist. In a new paper in the journal Science, I and a team of scientists from 18 universities and the U.S. Geological Survey explain why hazard models – used to help communities prepare for disasters – can’t just rely on the past. Instead, they need to be nimble enough to forecast how hazards evolve in real time.

    The science behind cascading hazards

    Cascading hazards aren’t random. They emerge from physical processes that operate continuously across the landscape – sediment movement, weathering, erosion. Together, the atmosphere, biosphere and the earth are constantly reshaping the conditions that cause natural disasters.

    For instance, earthquakes fracture rock and shake loose soil. Even if landslides don’t occur during the quake itself, the ground may be weakened, leaving it primed for failure during later rainstorms.

    That’s exactly what happened after the 2008 earthquake in Sichuan Province, China, which led to a surge in debris flows long after the initial seismic event.

    A strong aftershock after a 7.8 magnitude earthquake in Sichuan province, China, in May 2008 triggered more landslides in central China.
    AP Photo/Andy Wong

    Earth’s surface retains a “memory” of these events. Sediment disturbed in an earthquake, wildfire or severe storm will move downslope over years or even decades, reshaping the landscape as it goes.

    The 1950 Assam earthquake in India is a striking example: It triggered thousands of landslides. The sediment from these landslides gradually moved through the river system, eventually causing flooding and changing river channels in Bangladesh some 20 years later.

    An intensifying threat in a changing world

    These risks present challenges for everything from emergency planning to home insurance. After repeated wildfire-mudslide combinations in California, some insurers pulled out of the state entirely, citing mounting risks and rising costs among the reasons.

    Cascading hazards are not new, but their impact is intensifying.

    Climate change is increasing the frequency and severity of wildfires, storms and extreme rainfall. At the same time, urban development continues to expand into steep, hazard-prone terrain, exposing more people and infrastructure to evolving risks.

    The rising risk of interconnected climate disasters like these is overwhelming systems built for isolated events.

    Yet climate change is only part of the equation. Earth processes – such as earthquakes and volcanic eruptions – also trigger cascading hazards, often with long-lasting effects.

    Mount St. Helens is a powerful example: More than four decades after its eruption in 1980, the U.S. Army Corps of Engineers continues to manage ash and sediment from the eruption to keep it from filling river channels in ways that could increase the flood risk in downstream communities.

    Rethinking risk and building resilience

    Traditionally, insurance companies and disaster managers have estimated hazard risk by looking at past events.

    But when the landscape has changed, the past may no longer be a reliable guide to the future. To address this, computer models based on the physics of how these events work are needed to help forecast hazard evolution in real time, much like weather models update with new atmospheric data.

    A March 2024 landslide in the Oregon Coast Range wiped out trees in its path.
    Brian Yanites, June 2025
    A drone image of the same March 2024 landslide in the Oregon Coast Range shows where it temporarily dammed the river below.
    Brian Yanites, June 2025

    Thanks to advances in Earth observation technology, such as satellite imagery, drones and lidar, which is similar to radar but uses light, scientists can now track how hillslopes, rivers and vegetation change after disasters. These observations can feed into geomorphic models that simulate how loosened sediment moves and where hazards are likely to emerge next.

    Researchers are already coupling weather forecasts with post-wildfire debris flow models. Other models simulate how sediment pulses travel through river networks.

    Cascading hazards reveal that Earth’s surface is not a passive backdrop, but an active, evolving system. Each event reshapes the stage for the next.

    Understanding these connections is critical for building resilience so communities can withstand future storms, earthquakes and the problems created by debris flows. Better forecasts can inform building codes, guide infrastructure design and improve how risk is priced and managed. They can help communities anticipate long-term threats and adapt before the next disaster strikes.

    Most importantly, they challenge everyone to think beyond the immediate aftermath of a disaster – and to recognize the slow, quiet transformations that build toward the next.

    Brian J. Yanites receives funding from the National Science Foundation.

    ref. Cascading disasters like those created by Hurricane Helene show why hazard models can’t rely on the past – https://theconversation.com/cascading-disasters-like-those-created-by-hurricane-helene-show-why-hazard-models-cant-rely-on-the-past-259502

    MIL OSI Analysis

  • MIL-OSI: Free IQ Test Online with Instant Results – Fast, Accurate & Free: QuickIQTest.org Launches Updated 2025 Free IQ Testing Service

    Source: GlobeNewswire (MIL-OSI)

    New York City, June 27, 2025 (GLOBE NEWSWIRE) — QuickIQTest.org launches a free IQ test with instant results and no registration required. The company is proud to announce the official release of its updated free IQ test online, offering instant results through a scientifically designed, user-friendly platform. The new version allows users worldwide to complete an accurate cognitive assessment in under 10 minutes, entirely free and without requiring any personal information or registration.

    ⇒ Take the Free IQ Test Online – No Delay, No Cost!

    QuickIQTest.org is a leading online resource for cognitive self-assessment. Focused on accessibility, scientific accuracy, and honest reporting, the platform has helped millions of users worldwide better understand their cognitive skills without fees, sign-ups, or invasive data practices.

    Already used by millions, QuickIQTest.org’s latest IQ testing tool provides a fast, accessible way to evaluate fluid intelligence, logical reasoning, numerical comprehension, and spatial pattern recognition. Whether accessed from a smartphone, tablet, or desktop, the test offers a seamless experience across devices.

    ⇒ Start Your Free IQ Test Online – Instant, Accurate Results!

    “We created this test to be practical, honest, and available to everyone without barriers,” said a spokesperson for QuickIQTest.org. “With the 2025 update, we’ve focused on delivering speed and scientific accuracy, without compromising user privacy or simplicity.”

    Unlike many free IQ test online service providers that rely on gimmicks or upsells, QuickIQTest.org delivers immediate IQ scores along with a basic breakdown of performance across core cognitive areas. For users seeking a deeper understanding, an optional advanced analysis provides further interpretation of results.

    ⇒ Start your Free IQ Test Online – Fast, Proven, Accurate!

    This release reinforces the platform’s mission to offer a credible, no-cost tool for individuals looking to understand their cognitive strengths and thinking style better. The test is ideal for:

    • Students exploring their learning profile
    • Educators seeking classroom-ready assessment tools
    • Professionals curious about their problem-solving abilities
    • Anyone interested in how they process and analyze information

    ⇒ Take a Free IQ Test with Instant Results on QuickIQTest.org

    Key Features of the Updated Free IQ Test at QuickIQTest.org:

    • ✅ 100% Free IQ Test Online
    • ✅ Instant Results with No Sign-Up Required
    • ✅ Mobile & Desktop Friendly
    • ✅ Scientifically Designed Questions
    • ✅ Basic and Advanced Score Interpretation Options
    • ✅ No Data Collection to View Results

    As global demand grows for free IQ tests with instant results, QuickIQTest.org sets itself apart by offering a transparent, science-based testing experience without distractions, ads, or misleading scoring tactics.

    ⇒ Try the Free IQ Test with Free Results at QuickIQTest.org

    What Is an IQ Test?

    IQ stands for Intelligence Quotient. It is a score derived from standardized tests to measure a person’s ability to reason, solve problems, and recognize patterns. An IQ score reflects how someone performs compared to others in their age group. The average IQ is typically set at 100, with most people scoring between 85 and 115.
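    The "85 to 115" range quoted above corresponds to roughly one standard deviation on each side of the mean under the usual normal-distribution model of IQ scoring. As a quick sanity check, and assuming the common convention of a standard deviation of 15 (an assumption; the article does not state the SD), the share of test-takers within that band can be computed:

    ```python
    from statistics import NormalDist

    # Model IQ scores as a normal distribution with mean 100.
    # A standard deviation of 15 is a common convention, assumed here.
    iq = NormalDist(mu=100, sigma=15)

    # Fraction of the population scoring between 85 and 115.
    share_85_to_115 = iq.cdf(115) - iq.cdf(85)
    print(f"Share scoring 85-115: {share_85_to_115:.1%}")  # ~68.3%
    ```

    That figure is the familiar one-sigma coverage of a normal curve, which is why "most people score between 85 and 115" is a standard shorthand.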

    These tests are often used in academic and professional settings to assess cognitive performance. While they don’t measure creativity or emotional understanding, they are a common method for evaluating specific mental skills.

    ⇒ Get Instant Scores with This Trusted Free IQ Test Online

    The Purpose of IQ Testing

    The primary function of an IQ test is to evaluate how effectively a person processes information. These assessments focus on areas such as:

    • Logical reasoning
    • Pattern recognition
    • Numerical analysis
    • Visual-spatial awareness
    • Short-term memory

    IQ tests are used in education, research, career planning, and personal development. Many people also take them out of curiosity, looking for a clearer picture of how their thinking compares to others.

    ⇒ Free IQ Test with Results – Fast, Honest, No Registration!

    A Brief History of IQ Tests

    The first modern IQ test was created in France in the early 1900s. Psychologist Alfred Binet developed a method to identify students who needed additional academic support. Lewis Terman later adapted Binet's system in the United States at Stanford University, resulting in the Stanford-Binet Intelligence Scale.

    This version introduced the idea of comparing mental age to actual age, which became the foundation for the IQ scoring model that is still in use today. Over the years, new tests have been developed to include broader types of reasoning and improved scoring accuracy.

    ⇒ Start the Free Online IQ Test at QuickIQTest.org Today!

    IQ Testing Today

    Modern IQ tests are often taken online. These tests typically use timed multiple-choice questions designed to measure core thinking abilities. Thanks to digital platforms, users can now take a free IQ test online and receive their score within minutes without needing an in-person appointment or long wait times.

    One widely used option is the test offered at QuickIQTest.org. It provides a science-based format that includes logical and visual tasks. No registration is required, and results are available immediately.

    IQ Tests for Children

    IQ testing is not limited to adults. There are versions specifically designed for children that use age-appropriate questions and scoring. A free IQ test for kids can help parents better understand how their child approaches problem-solving and which tasks they respond to most effectively. The test at QuickIQTest.org offers this option using the same standard of accuracy as the general version.

    ⇒ Try a Proven Free IQ Test Online – Instant Results Included!

    How Do Free Online IQ Tests Work?

    Online IQ tests are built around structured tasks that aim to measure specific areas of cognitive function. The most common formats include:

    • Logic puzzles: Identify relationships between shapes or sequences
    • Visual reasoning: Complete or match patterns using spatial awareness
    • Numerical sequences: Find missing values or detect number patterns
    • Short-term memory tasks: Recall and manipulate sets of information

    These formats are designed to measure fluid reasoning and problem-solving ability under controlled conditions.

    The test questions are usually multiple-choice. Users are asked to select the correct answer based on the data presented. Each question is intended to be objective and free from cultural or language bias.

    ⇒ Take This Free IQ Test with Free Instant Results Now!

    Structure of the Test on QuickIQTest.org

    The online IQ test at QuickIQTest.org uses a clear layout and simple instructions. The test begins immediately after the user starts. No login or email is required to initiate or see the results.

    The structure includes a series of progressively challenging tasks. The sequence of questions has been designed by cognitive assessment professionals. All items are displayed one at a time to reduce distractions.

    Most users complete the test in 10 minutes or less. The design allows for fast processing without sacrificing test quality.

    ⇒ Take an Accurate IQ Test Free Online in Minutes!

    Timed vs. Untimed Tests

    QuickIQTest.org uses a timed model. Each question must be answered within a certain period. This helps measure how quickly a user can recognize patterns or solve problems. A consistent time limit also allows scores to be compared across various users.

    Some IQ tests allow unlimited time. While this format may reduce pressure, it can produce less reliable results. When speed is not controlled, users may perform better or worse depending on test-taking habits rather than cognitive processing speed.

    The timed approach used at QuickIQTest.org is consistent with most standardized IQ assessments. It helps provide a balanced view of both accuracy and pace.

    ⇒ Get Trusted Instant Results with a Free IQ Test Online

    Scoring and Instant Results

    Once the test is complete, scores are generated immediately. This is one of the main advantages of using the free online IQ test from QuickIQTest.org.

    The platform uses a proprietary scoring system based on established intelligence testing models. The user’s performance is compared to normative data to produce an IQ score. That score represents where a person falls relative to others in their age group.

    The user receives performance feedback across specific questions along with the overall result. This includes reasoning accuracy, speed, and problem type. No additional sign-up is needed to access these details.

    This format provides a quick and reliable assessment that reflects real cognitive ability for users looking for a free IQ test with instant results.

    ⇒ Take Your IQ Test Free with QuickIQTest.org – Fast & Accurate

    Benefits of Taking a Free IQ Test Online

    Immediate Access to Results

    One of the main reasons people choose to get a free IQ test online is speed. After completing the test on QuickIQTest.org, results are provided instantly. There is no waiting period, and users do not need to provide personal information to receive their scores. This makes the process efficient for anyone seeking immediate feedback.

    For those comparing options, a free accurate IQ test that delivers real-time scoring offers a practical solution. The structure used by QuickIQTest.org allows for quick test completion while maintaining consistency in how answers are evaluated.

    ⇒ Free IQ Test with Free Results – Reliable and No Cost!

    No Registration Required

    QuickIQTest.org does not require users to create an account or submit contact information. The test can be accessed directly from the homepage, and users receive their results immediately after completion.

    This approach appeals to individuals who prefer to keep testing private. There are no follow-up emails or prompts to share results. The focus remains on allowing users to measure their cognitive ability without added steps.

    ⇒ Take the Free IQ Test Online – No Credit Card Needed!

    Practical Use Cases

    IQ tests are used for a variety of reasons. Some take them to evaluate personal strengths. Others use the results to support academic or professional planning.

    • Students may use IQ scores to identify areas of strength or prepare for standardized testing.
    • Professionals may take a test to assess their problem-solving ability in preparation for interviews or advancement opportunities.
    • Parents may use a test to better understand their child’s learning style or reasoning skills.

    A free IQ test with results allows people to explore these areas without cost or commitment.

    ⇒ QuickIQTest.org’s Free IQ Test with Instant Results Available

    Compatible Across Devices

    The test at QuickIQTest.org works on most devices, including smartphones, tablets, and laptops. The interface is built for responsive access, with no downloads required.

    This level of access makes it easy for users to complete the test when it is most convenient for them. Whether at home or on a break at work, the platform supports a flexible testing experience.

    Focused Format

    Online testing also removes some of the obstacles found in traditional assessments. There is no need to travel, schedule an appointment, or complete paperwork. Instead, the test is available anytime, and users can begin as soon as they are ready.

    Since the test follows a simple, focused layout, users are not distracted by unrelated content or advertising interruptions.

    ⇒ Free Online IQ Test – Trusted and Fast, Try Now

    Free vs. Paid Online IQ Tests

    What Free Tests Typically Provide

    A free IQ test online usually offers a short series of timed questions to give a basic overview of a person’s reasoning ability. These questions often cover visual patterns, logic, and numerical sequences. Many users try free versions to get a general sense of how they perform these tasks.

    At QuickIQTest.org, the free IQ test includes a full set of questions and provides an immediate score once the test is completed. Registering or entering an email is not required to see the result. The structure allows users to complete the assessment without delays or access issues.

    The free IQ test with free results includes a performance summary across different types of reasoning. While more detailed reporting is available through the platform’s paid option, the basic score is presented clearly and without restriction.

    ⇒ Get a Free IQ Test with Results – Easy and Quick

    How Paid Versions Compare

    Some users may choose to upgrade for a more detailed breakdown of their score. Paid versions typically offer extended insights into cognitive categories, including logic, spatial awareness, numerical reasoning, and timing accuracy.

    Paid IQ tests can also include downloadable reports, percentile rankings, and score interpretation guides. These may be helpful for individuals using their test results for academic or professional purposes.

    QuickIQTest.org offers this option, but it does not restrict the free version in a way that forces users to pay. The full test and core results remain free to access.

    ⇒ Take an IQ Test Free Online – No Waiting, Instant Results

    When Paid Upgrades Make Sense

    A paid test may be helpful in the following situations:

    • When a user needs a full cognitive profile for documentation or planning
    • When applying for certain academic programs or training institutions
    • When preparing for high-level job assessments that include aptitude testing

    The extended score analysis can provide more detailed insight than the basic version in these cases.

    For casual users or those looking to test their ability quickly, the free online IQ test from QuickIQTest.org is often sufficient.

    ⇒ Reliable Free IQ Test Online with Free Instant Results!

    Warning Signs to Watch For

    While many free IQ test sites claim to offer value, not all follow transparent practices. Some common issues include:

    • Requiring payment before showing the score
    • Showing inflated results with no explanation of how the score was calculated
    • Redirecting users to unrelated offers or subscriptions
    • Requiring full personal information to unlock any results

    These signs suggest that the test is focused on data collection or marketing, not accurate scoring.

    QuickIQTest.org avoids these tactics by providing a free IQ test with free results that are accessible, clear, and independent of promotional pressure.

    ⇒ IQ Test Free Results – Take It Today for Proven Accuracy

    How to Prepare for a Free IQ Test Online

    One of the most effective ways to prepare for a free IQ test online is to become familiar with the questions you will likely encounter. These usually include:

    • Number sequences
    • Visual pattern recognition
    • Logical reasoning tasks
    • Short-term memory questions
    • Word problems or analogies

    Practicing similar formats can help reduce hesitation during the actual test. These question types can be found in logic puzzle books or educational apps. Reviewing examples in advance can help build confidence.

    ⇒ Try the IQ Test Free Online at QuickIQTest.org

    Set Up a Focused Environment

    Taking the test in a quiet and comfortable space can help reduce distractions. Before beginning, it’s recommended to:

    • Choose a time of day when you feel alert
    • Turn off phone notifications or close other browser tabs
    • Use headphones if background noise is a concern
    • Have a pen and paper nearby if you prefer to make notes

    The test at QuickIQTest.org is timed, so being prepared before starting allows you to focus on answering questions without interruptions.

    Manage Test Anxiety

    Some users may feel pressure when taking a timed test, especially if unfamiliar with the format. Keeping expectations realistic can reduce unnecessary stress.

    Here are a few basic strategies to manage test anxiety:

    • Take a few minutes to breathe deeply before starting
    • Remember that the score reflects performance at one moment, not overall intelligence
    • Stay focused on one question at a time
    • Move on if a question takes too long, and return to it later if possible

    Staying calm often leads to better performance than over-preparing or worrying about the outcome.

    ⇒ Fast & Accurate Free IQ Test with Free Results

    Rest and Mental Warmups

    Being well-rested can improve concentration and reduce mistakes. Try to get adequate sleep the night before and avoid taking the test when tired or distracted.

    Before starting the test, consider doing a short mental warmup. This could include:

    • Solving a few basic math problems
    • Looking at a sample logic puzzle
    • Reading a short article to get your mind working

    These steps help activate the thinking processes used during the test without causing fatigue.

    Use the Test as a Self-Check

    An IQ test is one way to observe how you approach problem-solving under time pressure. It does not require weeks of preparation. Reviewing question types and setting up a calm space can be enough for most users to feel ready.

    For those looking to take a free IQ test with free results, QuickIQTest.org offers a format that requires no registration and gives results immediately. You can take the IQ test for free and repeat it later to see how consistent your scores are over time.

    ⇒ Take a Proven Online IQ Test Free with Instant Feedback

    Most Accurate Free IQ Test Online With Instant Results in 2025


    What Accuracy Means in IQ Testing

    Accuracy in an IQ test refers to how well it measures the abilities it is designed to assess. A well-designed test should reflect actual reasoning skills, not test-taking tricks or memorized answers. This includes clear questions, controlled timing, and scoring models based on extensive sample data.

    Tests that adjust difficulty, apply consistent time limits, and avoid bias tend to produce more dependable scores. Randomized question order, structured answer formats, and logic-based scoring models help reduce user inconsistencies.

    ⇒ Free IQ Test Online with Proven Accuracy – Start Now

    Key Features That Contribute to Accuracy

    Several technical factors improve the reliability of an online IQ test:

    • Standardized scoring: Results are calibrated against age-based norms
    • Balanced question design: Covers a wide range of reasoning tasks
    • Time control: Limits help reduce inflated scores caused by prolonged thinking
    • Adaptive feedback: Some platforms tailor scores based on speed and accuracy

    These features help prevent results from being skewed by guessing, overthinking, or external interference.

    ⇒ Free IQ Test and Results – Take Yours Instantly

    Why QuickIQTest.org is Considered a Leading Option

    QuickIQTest.org is often recommended by users who want a reliable, fast, and unbiased cognitive test. It includes:

    • A fixed set of logic-based tasks
    • A consistent, timed format
    • Instant scoring based on data models that reflect a broad user base
    • No registration or user tracking required to access the results

    Each score uses established test theory principles modeled after long-standing IQ frameworks in education and psychology. This positions the site as a strong option for users seeking the most accurate IQ test available for free.

    The test has been used by students, professionals, and teachers across different fields. Many have cited its simplicity and fairness as reasons they recommend it to others.

    ⇒ Try a Free IQ Test with Instant Results – Trusted Platform

    Understanding Your IQ Score

    How Scores Are Calculated

    An IQ score is a number used to indicate how a person performed on a structured reasoning ability test compared to others in the same age group. Most modern IQ tests use a scale where the average score is 100.

    This number does not change significantly between test platforms that follow recognized standards. A proper scoring system compares individual results to a large sample of test-takers. Scores are then grouped into categories for interpretation.

    General Score Ranges

    IQ scores are often organized into bands that reflect different levels of performance. While specific labels can vary by test, the general breakdown is as follows:

    • Below 85: Below average
    • 85 to 99: Low average
    • 100 to 114: Average range
    • 115 to 129: Above average
    • 130 and above: High ability or gifted range

    QuickIQTest.org uses this common structure to present scores clearly. After finishing the test, users receive their number score and an explanation of the performance range.
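    The bands above amount to a simple lookup table. The helper below is a hypothetical sketch that mirrors this article's breakdown; real tests may use different cutoffs and labels:

    ```python
    def iq_band(score: int) -> str:
        """Map an IQ score to a performance band.

        Hypothetical helper: the cutoffs and labels follow the
        breakdown listed in this article, not any one test's norms.
        """
        if score < 85:
            return "Below average"
        elif score <= 99:
            return "Low average"
        elif score <= 114:
            return "Average range"
        elif score <= 129:
            return "Above average"
        else:
            return "High ability or gifted range"

    print(iq_band(100))  # Average range
    print(iq_band(130))  # High ability or gifted range
    ```

    Presenting the band alongside the raw number, as the platform does, gives users an immediate sense of where the score falls without requiring them to memorize the scale.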

    ⇒ Start the Trusted Free IQ Test Online – QuickIQTest.org

    What the Score Means

    The score reflects how well the user completed reasoning tasks under controlled conditions. It does not measure creativity, motivation, knowledge, or communication skills. It is also not a prediction of future success. Instead, it shows how someone performed on a specific type of problem-solving under time pressure.

    For this reason, scores may change slightly between attempts, especially if a user is more familiar with the format the second time. Consistency across attempts is one way to check for reliability.

    How QuickIQTest.org Presents Results

    After completing the free IQ test with results, users are shown:

    • Their overall IQ score
    • A basic breakdown of task types (e.g., logic, patterns, numbers)
    • A brief interpretation of the score range

    There is no requirement to register to access these details. The feedback is available immediately after completing the test.

    The platform also allows users to try the test again at any time. This can help track progress or compare scores across different days or devices.

    ⇒ Get Your IQ Test Free Results at QuickIQTest.org

    Free IQ Tests for Kids, Teens, and Adults

    Why Age Matters in IQ Testing

    IQ scores reflect how a person performs compared with others in the same age group. This is why age is a key factor in the test format and the interpretation of results. A test that does not account for age can produce misleading outcomes.

    Children, teens, and adults often process information differently. The types of tasks they handle best and the time needed to complete them may vary. An accurate IQ test should use a scoring model that adjusts for these differences.

    Use Cases by Age Group

    IQ tests are used for different purposes depending on the age of the person taking them:

    • Kids: Educators and parents may use test results to understand learning strengths or to determine if further evaluation is needed for gifted programs or academic support.
    • Teens: Testing during high school can help students explore their problem-solving skills before choosing courses or college plans.
    • Adults: Some adults take IQ tests to assess their general reasoning ability or prepare for assessments in employment or training programs.

    Each group benefits from a structured test scored according to typical performance ranges for that age.

    ⇒ Take an Online IQ Test Free and Get Fast, Honest Scores

    How QuickIQTest.org Accommodates All Age Groups

    QuickIQTest.org offers a consistent testing format appropriate for a wide range of users. The questions reflect general reasoning ability without relying on specific academic knowledge. This makes the test suitable for children, teens, and adults.

    For younger users, a version of the free IQ test for kids includes simpler language and visual-based questions. The scoring is adjusted to reflect developmental benchmarks rather than adult standards. Parents can use this version to get a general understanding of their child’s reasoning style.

    Teens and adults use the standard version, which includes a broader mix of logic, pattern, and number-based tasks. All users receive a clear score at the end of the test, and the process remains the same across devices.

    QuickIQTest.org provides an IQ test online free of charge with no registration needed. The test is timed, results are given instantly, and users can retake it to check consistency.

    Whether used in a school setting, at home, or during career planning, the platform supports access for users at different stages of learning and development.

    ⇒ Free IQ Test with Free Results – No Sign Up Needed!

    Is a Free Online IQ Test Legitimate?

    Traditional IQ assessments are often administered by licensed psychologists in a controlled setting. These tests may take one to two hours, involve verbal interviews, and include subtests scored manually. They are sometimes used in academic placement, psychological evaluations, or legal matters.

    Online IQ tests, by comparison, are self-directed and usually shorter. While they do not replace a full clinical evaluation, they can still offer valid feedback when built on recognized test design principles.

    The key difference is scope. A clinical test may assess more variables. An online test focuses on speed, logic, and pattern-based reasoning.

    ⇒ Take the Reliable Free IQ Test Online – Instant Results

    What Makes a Free Online IQ Test Credible

    A legitimate online IQ test follows specific practices that support fair scoring and user trust. These include:

    • Standardized test structure
    • Clear question formatting
    • Time limits for each section
    • Results based on sample population scoring models
    • No requirement to pay or register to access scores

    A site that uses inconsistent timing, does not explain scores, or inflates results without justification should be viewed cautiously.

    A real IQ test online also avoids advertising pressure or unrelated offers during test-taking. If a user is constantly redirected or asked for irrelevant information, it may not be a trustworthy source.

    ⇒ Try an Accurate IQ Test Free with Instant Results

    Why QuickIQTest.org Is Considered Reliable

    QuickIQTest.org is a legit and accurate free IQ test online used by individuals seeking a fast, structured way to evaluate basic reasoning ability. It uses a fixed test format, applies a consistent time frame for each user, and presents results immediately after the test.

    The scoring model is based on standard cognitive testing practices. The test includes logical reasoning, visual sequencing, and numeric analysis, all scored against a baseline designed to reflect average performance ranges.

    Users do not need to sign up or pay to access their scores. The platform does not collect personal data in exchange for results. This approach supports transparency and reduces barriers to testing.

    QuickIQTest.org has been used by students, job applicants, educators, and others looking to check cognitive problem-solving ability quickly. While it is not a substitute for a full clinical exam, it provides accurate feedback in a short format.

    ⇒ Get Proven Results with a Free Online IQ Test

    Conclusion

    QuickIQTest.org offers a free IQ test with instant results that is accurate, fast, and easy to access. The test follows a standardized structure, uses time-based scoring, and does not require registration. Users of all ages can complete the test on any device and receive immediate results backed by a real scoring model.

    Whether you’re looking for a legit IQ test online, a free accurate IQ test, or a free online IQ test for kids, the platform provides a reliable option without unnecessary steps.

    FAQs

    What is the best free IQ test online with instant results?

    If you’re looking for a free IQ test with instant results, QuickIQTest.org offers one of the most trusted and scientifically designed options. It delivers fast scoring and immediate feedback without requiring registration.

    Can I really get an accurate IQ test for free?

    Yes, many platforms now offer an IQ test for free that is both reliable and informative. QuickIQTest.org provides a free IQ test online that evaluates core cognitive skills with a validated scoring model.

    Is there an IQ test free online with no registration needed?

    Absolutely. QuickIQTest.org offers a free IQ test online free of sign-ups. You can take the test instantly and receive results in under 10 minutes without creating an account or providing personal data.

    Where can I find a free IQ test with free results?

    You can access a free IQ test with free results at QuickIQTest.org. It’s completely free to take, and your score breakdown is available right after completion—no hidden fees or upsells.

    How long does it take to complete a free online IQ test?

    Most free online IQ test options take between 8 to 12 minutes. The one on QuickIQTest.org is designed to be efficient and accurate, offering a fast way to test your intelligence from any device.

    Do free IQ tests provide reliable results?

    While not all IQ test online free platforms are created equal, some like QuickIQTest.org use standardized question patterns and deliver credible results. It’s a free IQ test with results that reflect core aspects of intelligence.

    Are free IQ tests suitable for students and professionals?

    Yes, a high-quality free IQ test with results can be valuable for students, professionals, and anyone curious about their cognitive abilities. It helps identify strengths in logic, reasoning, and problem-solving.

    Can I take a free IQ test on my phone or tablet?

    Definitely. QuickIQTest.org offers a free IQ test online that’s fully optimized for mobile and desktop. You can complete the test on any device without downloading anything.

    What’s included in a free IQ test with instant results?

    A typical free IQ test with instant results includes a score summary and performance breakdown across areas like pattern recognition, logic, and numerical reasoning—all delivered immediately after the test.

    Is there a free IQ test and free results option with no hidden costs?

    Yes. Platforms like QuickIQTest.org provide a free IQ test and free results with no hidden fees or tricks. You can test your IQ and view your score instantly without entering payment information.

    Media Contact
    Company: Quick IQ Test
    Contact Person: Sean C. Bailey
    Email: support@quickiqtest.org
    Address: 3445 Canterbury Drive, New York, NY 10016, USA
    URL: https://quickiqtest.org/
    Phone: +1 646-598-0584
    Content Accuracy Disclaimer
    Every effort has been made to ensure the accuracy of the information presented in this article. However, due to the dynamic nature of product formulations, promotions, and availability, details may change without notice. The publisher makes no warranties or representations as to the current completeness or accuracy of any content, including product claims, pricing, or ingredient lists.
    It is the responsibility of the reader to verify product information directly through the official website or manufacturer prior to making a purchasing decision. Any reliance placed on the information in this article is done strictly at your own risk.
    Affiliate Disclosure
    This article may contain affiliate links. If you purchase a product or service through these links, the publisher may earn a commission at no additional cost to you. These commissions help support the creation of in-depth reviews and educational wellness content.
    The publisher only promotes products that have been independently evaluated and deemed potentially beneficial to readers. However, this compensation may influence the content, topics, or products discussed in this article. The views and opinions expressed are those of the author and do not necessarily reflect the official policy or position of any affiliate partner or product provider.
    All product reviews and descriptions reflect the author’s honest opinion based on available public data, user feedback, and scientific references at the time of writing. The inclusion of affiliate links does not influence the objectivity or integrity of the content. However, readers are encouraged to independently verify product information and consult with healthcare professionals prior to purchase or use.
    No warranties, either expressed or implied, are made about the completeness, accuracy, reliability, or suitability of the content provided. The publisher and all affiliated parties expressly disclaim any and all liability arising directly or indirectly from the use of any information contained herein.
    Product and Trademark Rights
    All product names, logos, and brands mentioned are the property of their respective owners. Use of these names does not imply endorsement unless explicitly stated. QuickIQTest.org® is a trademark of its respective brand owner.

    Attachment

    The MIL Network

  • MIL-OSI Analysis: Toxic algae blooms are lasting longer in Lake Erie − why that’s a worry for people and pets

    Source: The Conversation – USA – By Gregory J. Dick, Professor of Biology, University of Michigan

    A satellite image from Aug. 13, 2024, shows an algal bloom covering approximately 320 square miles (830 square km) of Lake Erie. By Aug. 22, it had nearly doubled in size. NASA Earth Observatory

    Federal scientists released their annual forecast for Lake Erie’s harmful algal blooms on June 26, 2025, and they expect a mild to moderate season. However, anyone who comes in contact with the blooms can face health risks, and it’s worth remembering that 2014, when toxins from algae blooms contaminated the water supply in Toledo, Ohio, was considered a moderate year, too.

    We asked Gregory J. Dick, who leads the Cooperative Institute for Great Lakes Research, a federally funded center at the University of Michigan that studies harmful algal blooms among other Great Lakes issues, why they’re such a concern.

    The National Oceanic and Atmospheric Administration’s prediction for harmful algal bloom severity in Lake Erie compared with past years.
    NOAA

    1. What causes harmful algal blooms?

    Harmful algal blooms are dense patches of excessive algae growth that can occur in any type of water body, including ponds, reservoirs, rivers, lakes and oceans. When you see them in freshwater, you’re typically seeing cyanobacteria, also known as blue-green algae.

    These photosynthetic bacteria have inhabited our planet for billions of years. In fact, they were responsible for oxygenating Earth’s atmosphere, which enabled plant and animal life as we know it.

    The leading source of harmful algal blooms today is nutrient runoff from fertilized farm fields.
    Michigan Sea Grant

    Algae are natural components of ecosystems, but they cause trouble when they proliferate to high densities, creating what we call blooms.

    Harmful algal blooms form scums at the water surface and produce toxins that can harm ecosystems, water quality and human health. They have been reported in all 50 U.S. states, all five Great Lakes and nearly every country around the world. Blue-green algae blooms are becoming more common in inland waters.

    The main sources of harmful algal blooms are excess nutrients in the water, typically phosphorus and nitrogen.

    Historically, these excess nutrients mainly came from sewage and phosphorus-based detergents used in laundry machines and dishwashers that ended up in waterways. U.S. environmental laws in the early 1970s addressed this by requiring sewage treatment and banning phosphorus detergents, with spectacular success.

    How pollution affected Lake Erie in the 1960s, before clean water regulations.

    Today, agriculture is the main source of excess nutrients from chemical fertilizer or manure applied to farm fields to grow crops. Rainstorms wash these nutrients into streams and rivers that deliver them to lakes and coastal areas, where they fertilize algal blooms. In the U.S., most of these nutrients come from industrial-scale corn production, which is largely used as animal feed or to produce ethanol for gasoline.

    Climate change also exacerbates the problem in two ways. First, cyanobacteria grow faster at higher temperatures. Second, climate-driven increases in precipitation, especially large storms, cause more nutrient runoff that has led to record-setting blooms.

    2. What does your team’s DNA testing tell us about Lake Erie’s harmful algal blooms?

    Harmful algal blooms contain a mixture of cyanobacterial species that can produce an array of different toxins, many of which are still being discovered.

    When my colleagues and I recently sequenced DNA from Lake Erie water, we found new types of microcystins, the notorious toxins that were responsible for contaminating Toledo’s drinking water supply in 2014.

    These novel molecules cannot be detected with traditional methods and show some signs of causing toxicity, though further studies are needed to confirm their human health effects.

    Blue-green algae blooms in freshwater, like this one near Toledo in 2014, can be harmful to humans, causing gastrointestinal symptoms, headache, fever and skin irritation. They can be lethal for pets.
    Ty Wright for The Washington Post via Getty Images

    We also found organisms responsible for producing saxitoxin, a potent neurotoxin that is well known for causing paralytic shellfish poisoning on the Pacific Coast of North America and elsewhere.

    Saxitoxins have been detected at low concentrations in the Great Lakes for some time, but the recent discovery of hot spots of genes that make the toxin makes them an emerging concern.

    Our research suggests warmer water temperatures could boost its production, which raises concerns that saxitoxin will become more prevalent with climate change. However, the controls on toxin production are complex, and more research is needed to test this hypothesis. Federal monitoring programs are essential for tracking and understanding emerging threats.

    3. Should people worry about these blooms?

    Harmful algal blooms are unsightly and smelly, making them a concern for recreation, property values and businesses. They can disrupt food webs and harm aquatic life, though a recent study suggested that their effects on the Lake Erie food web so far are not substantial.

    But the biggest impact is from the toxins these algae produce that are harmful to humans and lethal to pets.

    The toxins can cause acute health problems such as gastrointestinal symptoms, headache, fever and skin irritation. Dogs can die from ingesting lake water with harmful algal blooms. Emerging science suggests that long-term exposure to harmful algal blooms, for example over months or years, can cause or exacerbate chronic respiratory, cardiovascular and gastrointestinal problems and may be linked to liver cancers, kidney disease and neurological issues.

    The water intake system for the city of Toledo, Ohio, is surrounded by an algae bloom in 2014. Toxic algae got into the water system, resulting in residents being warned not to touch or drink their tap water for three days.
    AP Photo/Haraz N. Ghanbari

    In addition to exposure through direct ingestion or skin contact, recent research also indicates that inhaling toxins that get into the air may harm health, raising concerns for coastal residents and boaters, but more research is needed to understand the risks.

    The Toledo drinking water crisis of 2014 illustrated the vast potential for algal blooms to cause harm in the Great Lakes. Toxins infiltrated the drinking water system and were detected in processed municipal water, resulting in a three-day “do not drink” advisory. The episode affected residents, hospitals and businesses, and it ultimately cost the city an estimated US$65 million.

    4. Blooms seem to be starting earlier in the year and lasting longer – why is that happening?

    Warmer waters are extending the duration of the blooms.

    In 2025, NOAA detected these toxins in Lake Erie on April 28, earlier than ever before. The 2022 bloom in Lake Erie persisted into November, which is rare if not unprecedented.

    Scientific studies of western Lake Erie show that the potential cyanobacterial growth rate has increased by up to 30% and the length of the bloom season has expanded by up to a month from 1995 to 2022, especially in warmer, shallow waters. These results are consistent with our understanding of cyanobacterial physiology: Blooms like it hot – cyanobacteria grow faster at higher temperatures.

    5. What can be done to reduce the likelihood of algal blooms in the future?

    The best and perhaps only hope of reducing the size and occurrence of harmful algal blooms is to reduce the amount of nutrients reaching the Great Lakes.

    In Lake Erie, where nutrients come primarily from agriculture, that means improving agricultural practices and restoring wetlands to reduce the amount of nutrients flowing off of farm fields and into the lake. Early indications suggest that Ohio’s H2Ohio program, which works with farmers to reduce runoff, is making some gains in this regard, but future funding for H2Ohio is uncertain.

    In places like Lake Superior, where harmful algal blooms appear to be driven by climate change, the solution likely requires halting and reversing the rapid human-driven increase in greenhouse gases in the atmosphere.

    Gregory J. Dick receives funding for harmful algal bloom research from the National Oceanic and Atmospheric Administration, the National Science Foundation, the United States Geological Survey, and the National Institutes of Health. He serves on the Science Advisory Council for the Environmental Law and Policy Center.

    ref. Toxic algae blooms are lasting longer in Lake Erie − why that’s a worry for people and pets – https://theconversation.com/toxic-algae-blooms-are-lasting-longer-in-lake-erie-why-thats-a-worry-for-people-and-pets-259954

    MIL OSI Analysis

  • MIL-OSI: Climb Channel Solutions Announces Distribution Partnership with Egnyte

    Source: GlobeNewswire (MIL-OSI)

    EATONTOWN, N.J., June 27, 2025 (GLOBE NEWSWIRE) — Climb Channel Solutions, an international specialty technology distributor and wholly owned subsidiary of Climb Global Solutions, Inc. (NASDAQ: CLMB), is proud to announce a distribution agreement with Egnyte, a leader in secure content collaboration, intelligence, and governance.

    This partnership enables Climb to deliver Egnyte’s cloud-native platform to partners and their customers across the United States, reinforcing Climb’s commitment to expanding access to transformative technologies worldwide. By adding Egnyte to its portfolio, Climb is equipping resellers with a trusted, scalable platform that fits seamlessly into both SMB and enterprise environments. This partnership underscores Climb’s mission to deliver partner-first technologies that move with the speed of modern business.

    “We are thrilled to announce Egnyte’s partnership with Climb Channel Solutions as we continue to invest deeply in the partner community,” said Bob Gagnon, Senior Vice President of Global Channel Sales at Egnyte. “Egnyte is committed to delivering high-quality, innovative solutions, and Climb Channel Solutions is uniquely positioned to add value to the distribution network with deep industry expertise, a strong track record of on-time delivery, and a collaborative approach tailored to regional and strategic objectives.”

    This partnership comes on the heels of Egnyte announcing enhancements to its Partner Program and new partner portal, Partner Hub, reflecting its commitment to delivering a more streamlined approach to better support a broader network of solution partners. Egnyte’s partner program is built upon its three core partnering priorities: profitability, enablement, and simplicity, to help our partners bring Egnyte’s AI-powered cloud collaboration platform to more businesses. Resellers will be able to take advantage of Egnyte’s agile supply chain support, responsive technical assistance, and competitive pricing to enable faster market penetration and sustained growth.

    “Egnyte is a standout addition to our vendor ecosystem,” said Dale Foster, CEO of Climb. “Their channel momentum, combined with a product that addresses real-time collaboration and secure file sharing, makes this a win for our partners. We’re excited to support Egnyte’s continued growth through Climb’s extensive reseller network and to help businesses leverage data more intelligently and securely. Together, we’re making enterprise-grade solutions more accessible.”

    Those interested in distribution services and solutions should contact Climb by phone at +1.800.847.7078 (US), or +1.888.523.7777 (Canada), or by email at Sales@ClimbCS.com.

    About Climb Channel Solutions and Climb Global Solutions

    Climb Channel Solutions is a global specialty technology distributor focused on Security, Data Management, Connectivity, Storage & HCI, Virtualization & Cloud, and Software & Application Lifecycle. What sets Climb apart is our commitment to reimagining distribution through a data-driven approach that brings emerging technologies to market faster. We empower our partners with speed to market, flexible financing, real-time quoting, best-of-breed channel operations, and exceptional service—transforming how distribution supports growth and scalability. Climb Channel Solutions is a wholly owned subsidiary of Climb Global Solutions (NASDAQ: CLMB). Experience distribution reimagined and discover how our people-first approach helps VARs and MSPs grow, scale, and accelerate their business. Visit www.ClimbCS.com, call 1-800-847-7078, and connect with us on LinkedIn!

    For Media & PR inquiries contact:
    Climb Channel Solutions
    Media Relations
    media@ClimbCS.com

    Investor Relations Contact:
    Elevate IR
    Sean Mansouri, CFA
    T: 720-330-2829
    CLMB@elevate-ir.com

    About Egnyte

    Egnyte combines the power of cloud content management, data security, and AI into one intelligent content platform. More than 22,000 customers trust Egnyte to improve employee productivity, automate business processes, and safeguard critical data, in addition to offering specialized content intelligence and automation solutions across industries, including architecture, engineering, and construction (AEC), life sciences, and financial services. For more information, visit www.egnyte.com.

    Media Contact:
    Erin Mancini
    Senior Manager of Public Relations
    media@egnyte.com

    The MIL Network

  • India’s electricity use may hit 4 trillion units in a decade: report

    Source: Government of India

    India’s electricity demand is projected to triple to a staggering 4 trillion units (kilowatt-hours), or about 4,000 TWh, by 2035, driven by industrial expansion, urbanisation, and the electrification of transport, according to a report released on Friday by OmniScience Capital.

    By 2035, three transformative sectors—electric vehicles (EVs), data centres (DCs), and railways—are expected to be among the largest consumers of electricity, collectively consuming around 500 TWh, or 12–13 per cent of India’s total projected power demand.

    This marks a pivotal shift in the country’s energy landscape, where traditional industrial and residential consumption is now being complemented by these emerging drivers.

    The report underscores the importance of India’s energy transition for a sustainable future. Policy initiatives such as the Net Zero target, the 500 GW renewable energy goal, EV adoption, and the rooftop solar push are playing a critical role in driving this transformation.

    India’s per capita electricity consumption is expected to nearly double—from 1,400 kWh in 2024 to 2,575 kWh by 2035—driven by rapid economic growth, urbanisation, and rising household incomes.

    “India’s electricity demand reaching four trillion units by 2035 is a signal of the country’s accelerating industrial growth, digital transformation, and rising quality of life,” said Ashwini Shami, Executive Vice President at OmniScience Capital. “This trend unlocks significant investment potential in energy infrastructure, renewable energy, and modernising the grid.”

    As more people migrate to cities and adopt energy-intensive appliances, and as industries expand under initiatives like Make in India, electricity consumption is set to increase significantly. The push for digital infrastructure, EVs, and rural electrification is expected to further drive this growth.

    The report also notes that the transition to cleaner and more accessible energy sources will make electricity more affordable and widely available, leading to increased consumption across all sectors.

    India’s commercial and service sectors are emerging as major engines of electricity demand. From 181 TWh in 2023, consumption in these segments is projected to rise to 798 TWh by 2035, marking a 4.4x increase and a compound annual growth rate (CAGR) of 13.2 per cent—the second-fastest among all sectors. This would raise their share to nearly 20 per cent of total electricity usage, reflecting India’s rapid shift toward a service-led, digitally connected economy.

    The transport sector—comprising EVs and railways—is projected to become the fastest-growing consumer of electricity, with usage expected to surge from 25 TWh in 2022 to 162 TWh by 2035, representing a CAGR of 16.8 per cent. The main drivers include accelerated EV adoption, the expansion of charging infrastructure, and railway electrification, the report added.
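    The growth figures quoted above can be sanity-checked with the standard compound annual growth rate formula, CAGR = (end/start)^(1/years) − 1. A minimal sketch (all values are the report's figures as quoted; the function name `cagr` and the choice of 12 compounding periods for the transport sector are assumptions made for illustration):

    ```python
    def cagr(start: float, end: float, years: int) -> float:
        """Compound annual growth rate: (end / start) ** (1 / years) - 1."""
        return (end / start) ** (1 / years) - 1

    # Commercial & services: 181 TWh (2023) -> 798 TWh (2035), 12 years.
    commercial = cagr(181, 798, 12)   # ~0.132, matching the quoted 13.2%

    # Transport: 25 TWh -> 162 TWh; the quoted 16.8% implies 12 periods,
    # even though 2022 -> 2035 spans 13 calendar years.
    transport = cagr(25, 162, 12)     # ~0.168

    # Per-capita consumption: 1,400 kWh (2024) -> 2,575 kWh (2035).
    ratio = 2575 / 1400               # ~1.84, i.e. "nearly double"
    ```

    Over 13 periods the transport CAGR works out to roughly 15.5%, so the report's 16.8% figure is consistent only with a 12-year compounding window.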

    —IANS


  • MIL-OSI Analysis: Poland, divided between Trump and the EU

    Source: The Conversation – France – By Jacques Rupnik, Directeur de recherche émérite, Centre de recherches internationales (CERI), Sciences Po

    Karol Nawrocki in the Oval Office with Donald Trump on May 25th 2025, ten days before the first round of the Polish presidential election. It is very rare for a sitting US president to receive a candidate in a foreign election.
    White House X account

    Nawrocki’s narrow victory (50.89%) over Trzaskowski, the mayor of Warsaw and candidate of the government coalition, illustrates and reinforces the political polarisation of Poland and the rise of the populist “Trumpist” right in Central and Eastern Europe. Since the start of the war in Ukraine, there has been much speculation about whether Europe’s geopolitical centre of gravity is shifting eastwards. The Polish election seems to confirm that the political centre of gravity is shifting to the right.

    A narrow victory

    We are witnessing a relative erosion of the duopoly of the two major parties, Civic Platform (PO) and Law and Justice (PiS), whose leaders – the current Prime Minister, Donald Tusk, and Jarosław Kaczyński respectively – have dominated the political landscape for over twenty years.

    Kaczyński’s skill lay in propelling a candidate with no responsibilities in his party, who was little known to the general public a few months ago, and, above all, who is from a different generation, to the presidency (a position held since 2015 by a PiS man, Andrzej Duda). Nawrocki, a historian by training and director of the Polish Institute of National Remembrance, has helped shape PiS’s memory policy. He won the second round, despite his troubled past as a hooligan, by appealing to voters on the right.

    In the first round, he won 29.5% of the vote, compared to Trzaskowski’s 31.36%, but the two far-right candidates, Sławomir Mentzen (an ultra-nationalist and economic libertarian) and Grzegorz Braun (a monarchist, avowed reactionary, and anti-Semite), won a total of 21% of the vote. They attracted a young electorate (60% of 18–29-year-olds), who overwhelmingly transferred their votes to Nawrocki in the second round.



    Despite a high turnout of 71% and favourable votes from the Polish diaspora (63%), Trzaskowski was unable to secure enough votes from the first-round candidates linked to the governing coalition, including those on the left (who won 10% between them) and the centre-right (Szymon Hołownia’s Third Way movement, which won 5% in the first round).

    A Tusk government struggling to implement its programme

    There are two Polands facing each other: the big cities, where incomes and levels of education are higher, and the more rural small towns, which are more conservative on social issues and more closely linked to the Catholic Church.

    The themes of nationhood – Nawrocki’s campaign slogan was “Poland first, Poles first” – family, and traditional values continue to resonate strongly with an electorate that has been loyal to PiS for more than twenty years. The electoral map, which shows a clear north-west/south-east divide, is similar to those of previous presidential elections and even echoes the partition of Poland at the end of the eighteenth century. The PiS vote is strongest in the part of the country that was under Russian rule until 1918. These historical factors, together with a more traditional Catholicism in these less developed regions and a strong sense of national identity, partly explain this voting pattern.

    The economic explanation for the vote is unconvincing. Over the past 25 years, Poland has undergone tremendous transformation, driven by steady economic growth. GDP per capita has risen from 25% to 80% of the EU average, although this growth has been unevenly distributed. Nevertheless, a relatively generous welfare state has been preserved.

    Clearly, however, this growth, driven by investment from Western Europe (primarily Germany) and European structural funds (3% of GDP), does not provide a sufficient electoral base for a liberal, centrist, pro-European government.

    It is precisely the government’s performance that may hold the key to Trzaskowski’s failure. Having come to power at the end of 2023 with a reformist agenda, Donald Tusk’s government has only been able to implement part of its programme, and it is difficult to be the candidate of an unpopular government. Conversely, the governing coalition has been weakened by the failure of its candidate.

    The main reason for the stalling of reforms is the presidential deadlock. Although the president has limited powers, he signs laws into force, and overriding his veto requires a three-fifths majority in parliament, which the governing coalition lacks.

    The president also plays a role in foreign policy by representing the country, and above all by appointing judges, particularly to the Supreme Court. This has hindered the judicial reforms expected after eight years of PiS rule. It is mainly in this area that Duda has obstructed progress. The election of Nawrocki, who is known for his combative nature, suggests that the period of cohabitation will be turbulent.

    What are the main international implications of Nawrocki’s election?

    Donald Tusk is now more popular in Europe than in Poland; in this respect, we can speak of a “Gorbachev syndrome”. In Central Europe, the Visegrad Group (comprising Hungary, Poland, the Czech Republic, and Slovakia) is deeply divided by the war in Ukraine, but it could find common ground around a populist sovereignty led by Hungary’s Viktor Orbán. Orbán was the first to congratulate Nawrocki on his victory, followed by his Slovak neighbour Robert Fico. The Czech Republic could also see a leader from this movement come to power if Andrej Babiš wins the parliamentary elections this autumn. Nawrocki would fit right into this picture.

    Since Donald Tusk returned to power, particularly during Poland’s EU presidency, which ends on 30 June, the focus has been on Poland’s “return” to the heart of the European process. Against the backdrop of the war in Ukraine and Poland’s pivotal role in coordinating a European response, the Weimar Group (comprising Paris, Berlin, and Warsaw) has emerged as a key player. Three converging factors have made this possible: the French president’s firm stance toward Russia; the new German chancellor, Friedrich Merz, breaking a few taboos on defence and budgetary discipline; and Donald Tusk, the former president of the European Council, regaining a place at the heart of the EU that his predecessors had abandoned. A framework for a strategic Europe was taking shape.

    However, President Nawrocki, and the PiS more generally, are taking a different approach to the EU: they are positioning themselves as Eurosceptic opponents defending sovereignty. They are playing on anti-German sentiment by demanding reparations 80 years after the end of the Second World War and asserting Poland’s sovereignty in the face of a “Germany-dominated Europe”. The Weimar Triangle, recently strengthened by the bilateral treaty between France and Poland signed on 9 May 2025, could be weakened on the Polish–German flank.

    As a historian and former director of the Second World War Museum in Gdansk and the Institute of National Remembrance, Nawrocki is well placed to exploit this historical resentment. He has formulated a nationalist memory policy centred on a discourse of victimhood, portraying Poland as perpetually under attack from its historic enemies, Russia and Germany.

    While there is a broad consensus in Poland regarding the Russian threat, opinions differ regarding the government’s desire to separate the traumas of the past, particularly those of the last war, from the challenges of European integration today.

    Memory issues also play a prominent role in relations with Ukraine. There is total consensus on the need to provide military support to Ukraine, under attack: this is obvious in Poland, given its history and geography – defending Ukraine is inseparable from Polish security. However, both Nawrocki and Trzaskowski have touched upon the idea that Ukraine should apologise for the crimes committed by Ukrainian nationalists during the last war, starting with the massacre of more than 100,000 Poles in Volyn (Volhynia, north-western Ukraine) by Stepan Bandera’s troops.

    Alongside memory policy, Nawrocki and the PiS are calling for the abolition of the 800 zloty (190 euros) monthly allowance paid to Ukrainian refugees. Poland had more than one million Ukrainian workers prior to the war, and more than two million additional workers have arrived since it started, although around one million have since relocated to other countries, primarily Germany and the Czech Republic.

    Prior to the second round of the presidential election, Nawrocki readily signed the eight demands of the far-right candidate Sławomir Mentzen, which included ruling out Ukraine’s future NATO membership. Playing on anti-Ukrainian (and anti-German) sentiment, alongside Euroscepticism and sovereignty, is one of the essential elements of the new president’s nationalist discourse.

    A Central and Eastern European Trumpism?

    Certain themes of the Polish election converge with a trend present throughout Central and Eastern Europe. We saw this at work in the Romanian presidential election, where the unsuccessful far-right nationalist candidate, George Simion, came to Warsaw to support Nawrocki, just as the winner, the pro-European centrist Nicușor Dan, lent his support to Trzaskowski. Nawrocki’s success reinforces an emerging “Trumpist” movement in Eastern Europe, with Viktor Orbán in Budapest as its self-proclaimed leader. A year ago, Orbán coined the slogan “Over there (in the United States), it’s MAGA; here, it will be MEGA: Make Europe Great Again”. The “Patriots for Europe” group, launched by Orbán last year, is intended to unify this movement within the European Parliament.

    American conservative networks, through the Conservative Political Action Conference (CPAC), a gathering of international hard-right figures, and the Trump administration are directly involved in this process. Shortly before the presidential election, Nawrocki travelled to Washington to arrange a photo opportunity with Trump in the Oval Office.

    Most notably, two days before the election, Kristi Noem, the US Secretary of Homeland Security, was dispatched on a mission to Poland. Speaking at the CPAC conference in Rzeszów, she explicitly linked a vote for Nawrocki to US security guarantees for Poland:

    “If you (elect) a leader that will work with President Donald J. Trump, the Polish people will have a strong ally that will ensure that you will be able to fight off enemies that do not share your values. […] You will have strong borders and protect your communities and keep them safe, and ensure that your citizens are respected every single day. […] You will continue to have a U.S. presence here, a military presence. And you will have equipment that is American-made, that is high quality.”

    “Fort Trump” is the name the outgoing president, Andrzej Duda, gave to the US military base financed by Poland under a bilateral agreement signed with Donald Trump during his first term in office, in 2018. Similarly, the US House Committee on Foreign Affairs sent a letter to the President of the European Commission accusing her of applying “double standards”, pointing out that EU funds had been blocked when the PiS was in power, and claiming that European money had been used to influence the outcome of the Polish presidential election in favour of Trzaskowski. The letter was posted online on the State Department website. Prioritising the transatlantic link at the expense of strengthening Europe was one of the issues at stake in the Warsaw presidential election.

    CPAC is playing a significant role in building a Trumpist national-populist network based on rejecting the “liberal hegemony” established in the post-1989 era, regaining sovereignty from the EU, and defending conservative values against a “decadent” Europe. Beyond the Polish presidential election, the goal seems clear: to divide Europeans and weaken them at a time when the transatlantic relationship is being redefined.

    Jacques Rupnik does not work for, consult for, own shares in, or receive funding from any organisation that would benefit from this article, and has disclosed no affiliations other than his research institution.

    ref. Poland, divided between Trump and the EU – https://theconversation.com/poland-divided-between-trump-and-the-eu-260007

    MIL OSI Analysis

  • MIL-OSI USA: Curiosity Blog, Sols 4580-4581: Something in the Air…

    Source: NASA

    Written by Scott VanBommel, Planetary Scientist at Washington University in St. Louis
    Earth planning date: Monday, June 23, 2025
    Curiosity was back at work on Monday, with a full slate of activities planned. While summer has officially arrived for much of Curiosity’s team back on Earth, Mars’ eldest active rover has recently come through the depths of southern Martian winter and is trending toward warmer temperatures itself. Warmer temperatures mean less component heating is required, freeing up more power for science and driving. However, the current cooler temperatures do present an opportunity to acquire quality short-duration APXS measurements first thing in the morning, which is what Curiosity elected to do once again.
    Curiosity’s plan commenced by brushing a rock target with potential cross-cutting veins, “Hornitos,” and subsequently analyzing it with APXS. A sequence of Mastcam images followed on targets such as “Volcán Peña Blanca,” “La Pacana,” “Iglesia de Jarinilla de Umatia,” and “Ayparavi.” ChemCam, returning to action after a brief and understood hiatus, rounded out the morning’s chemical analysis activities with a 5-point analysis of Ayparavi. After some images of the brush, and a handful of MAHLI snaps of Hornitos, Curiosity was on its way with a planned drive of about 37 meters (about 121 feet).

    Curiosity’s night would not be spent entirely dreaming of whatever rovers dream, but rather conducting a lengthy APXS analysis of the atmosphere. These analyses enable Curiosity’s team to assess the abundance of argon in the atmosphere — from a volume about the size of a pop can (or soda can, depending on your unit of preference) — which can be used to trace global circulation patterns and better understand modern Mars. Recently, Curiosity has been increasing the frequency of these measurements and pairing them with ChemCam “Passive Sky” observations. These ChemCam activities do not utilize the instrument’s laser, but instead use its other components to characterize the air above the rover. By combining APXS and ChemCam observations of the atmosphere, Curiosity’s team is able to better assess daily and seasonal trends in gases around Gale crater. A ChemCam “Passive Sky” was the primary observation in the second sol of the plan, with Curiosity spending much of the remaining time recharging and eagerly awaiting commands from Wednesday’s team.

    MIL OSI USA News

  • MIL-OSI USA: Governor Newsom announces appointments 6.26.25

    Source: US State of California 2

    Jun 26, 2025

    SACRAMENTO – Governor Gavin Newsom today announced the following appointments:

    Kira Younger, of Fair Oaks, has been appointed Chief Financial Officer and Director of the Finance and Accounting Division at the California Department of Social Services. Younger has been Chief of Fiscal Forecasting at the California Department of Social Services since 2021, where she has held several roles since 2016, including Budget Officer and Staff Services Manager. She was Financial Manager at the California Office of Systems Integration from 2018 to 2019. Younger earned a Master of Business Administration degree in Strategic Management and a Bachelor of Business Administration degree in Accounting from Western Governors University. This position does not require Senate confirmation, and the compensation is $176,160. Younger is a Democrat. 

    Lauren Gavin Solis, of Los Angeles, has been appointed Deputy Director of the Office of Medicare Innovation and Integration at the Department of Health Care Services. Solis has been Acting Group Director for the Medicare-Medicaid Coordination Office at Centers for Medicare and Medicaid Services since 2025, where she has held several roles since 2013, including Team Lead and Health Insurance Specialist. She was a Health Policy Scholar at the National Coalition on Health Care from 2012 to 2013. Solis was a Presidential Management Fellow at the National Institutes of Health from 2011 to 2013. She held several roles at Triage Consulting Group from 2005 to 2010, including Legal Services Manager, Senior Associate, and Associate. Solis earned a Master of Public Health degree in Health Systems and Policy from Johns Hopkins University and a Bachelor of Arts degree in Psychology from the University of California, Davis. This position does not require Senate confirmation, and the compensation is $187,020. Solis is a Democrat. 

    Julia Parish, of Oakland, has been appointed Deputy Director of Legislation, Regulation, and Policy at the California Civil Rights Department. Parish has been a Senior Staff Attorney at Legal Aid at Work since 2019, where she has held multiple positions since 2011, including Staff Attorney and Equal Justice Works AmeriCorps Legal Fellow. She was a Research Assistant to Professor David Oppenheimer at University of California, Berkeley School of Law in 2010. Parish earned a Juris Doctor degree from the University of California, Berkeley School of Law, a Master of Science degree in Education from Pace University, and a Bachelor of Arts degree in Political Science and Spanish from University of California, Berkeley. This position does not require Senate confirmation, and the compensation is $146,268. Parish is a Democrat.

    Juliet Michelson Wahleithner, of Fresno, has been appointed Director of Research, Evaluation, and Assessment at the Commission on Teacher Credentialing. Wahleithner has been a Special Consultant for the Office of Policy and Continuous Improvement at the Commission on Teacher Credentialing since 2025. Wahleithner has been an Associate Professor for Literacy Education at California State University, Fresno since 2021, where she has held several roles since 2015, including Director of Educator Preparation and Accreditation, Director of San Joaquin Valley Writing Project, and Assistant Professor. She held several roles at University of California, Davis School of Education from 2008 to 2015, including Postdoctoral Researcher, Lecturer, Accreditation Coordinator, and Graduate Student Assistant. Wahleithner held several roles at Lodi Unified School District from 1999 to 2007, including Differentiated Instruction Curriculum Coach and English and Journalism Teacher. She is a Member of the American Educational Research Association, California Council on Teacher Education, and Board of Directors of Saint Agnes Child Development Center. Wahleithner earned a Doctor of Philosophy degree in Education, a Master of Arts degree in Education, and a Bachelor of Arts degree in English from University of California, Davis. This position does not require Senate confirmation, and the compensation is $163,788. Wahleithner is a Democrat.

    Sophear Price, of Santa Rosa, has been appointed Skilled Nursing Facility Administrator at the Yountville Veterans Home of California. Price has been the Standards Compliance Coordinator at the Yountville Veterans Home of California since 2018. Price held multiple roles at the Sonoma Development Center from 2014 to 2017, including Community Programs Specialist II and Individual Programs Coordinator. She earned a Bachelor of Arts degree in Psychology from California State University, Sonoma. This position does not require Senate confirmation, and the compensation is $159,120. Price is registered without party preference. 


    MIL OSI USA News

  • MIL-OSI United Kingdom: Guidance for trade mark applicants following judgment in SkyKick v Sky

    Source: United Kingdom – Government Statements


    New guidance for trade mark applicants following Supreme Court judgment in the case of SkyKick UK Ltd and another v Sky Ltd and others.

    The Intellectual Property Office (IPO) has issued important new guidance for trade mark applicants following a Supreme Court judgment in the case of SkyKick UK Ltd and another v Sky Ltd and others.

    The updated Practice Amendment Notice (PAN 1/25) clarifies what is expected when filing specifications, and outlines changes to examination practices.

    These changes will take effect immediately.

    Important changes trade mark applicants and their representatives need to know

    The Supreme Court judgment addresses bad faith in trade mark applications, particularly concerning overly broad specifications where applicants have no intention to use the mark across all the claimed goods or services.
    Examiners will now actively consider whether specifications are “manifestly and self-evidently broad”, and may raise bad faith objections during the examination process.

    What you need to do

    Trade mark applicants should:

    • ensure specifications represent fair and reasonable claims for their business
    • be cautious when filing for large numbers of goods and services across multiple classes
    • consider whether broad terms like “computer software” or “clothing” truly reflect intended use, or whether sub-categories are more appropriate
    • be ready to explain their commercial reasons if challenged on the scope of an application

    What to expect during examination

    If examiners raise a bad faith objection, applicants will have two months to respond by either:

    • providing an explanation of their commercial reasons for the broad specification
    • restricting the goods/services to reflect their business more appropriately

    Certain applications will automatically trigger objections, including claims covering all 45 classes or all goods in Class 9 (which covers a wide range of goods related to technology, science, information processing and software).

    However, there will be other scenarios where examiners may raise objections, which will be dealt with on a case-by-case basis. Our aim is to strike a pragmatic balance, and the focus will be on manifestly and self-evidently broad specifications.

    The IPO’s Deputy CEO and Director of Services Andy Bartlett said:

    Following the Supreme Court’s judgment in the ‘Skykick’ case, we are issuing guidance to provide greater clarity and certainty for trade mark applicants and their representatives.

    The ruling represents a significant development in trade mark law, and this Practice Amendment Notice explains what is expected from applicants, and how our examination practices will change as a result.

    Understanding these changes will help our customers prepare appropriate specifications and avoid potential challenges and unnecessary delays in the application process.

    Customers requiring further information about the new guidance can get in touch with us at practicenoticequeries@ipo.gov.uk.

    When applying to register a trade mark, customers may wish to seek professional advice from a Chartered Trade Mark Attorney.

    For more information, read the full Practice Amendment Notice (PAN 1/25).

    Updates to this page

    Published 27 June 2025

    MIL OSI United Kingdom

  • MIL-OSI Russia: What do future IT specialists want: results of a survey by VK and HSE

    Translation. Region: Russian Federation

    Source: State University Higher School of Economics

    According to the survey, 8 out of 10 school graduates prepare for admission on their own, 69% participate in specialized Olympiads for this purpose, and 60% study in additional courses. Most respondents are convinced that studying at a university should give them the skills necessary for a career (82%) and, as a result, they expect to become sought-after specialists (75%).

    When choosing a university, applicants focus on the prestige of the educational institution (76%), the availability of programs in the chosen specialty (74%) and the demand for graduates in the labor market (73%).

    63% of respondents expect to continue their education in a master’s program, and 20% through training provided by an employer. After completing their studies, every second respondent (51%) plans to develop IT products and technologies, 18% to create their own startup, and 9% to engage in science and research. A fifth of respondents (21%) had not yet decided on their career plans before enrolling.

    Prospective students choose a specialty based on their personal interest in a specific area (79%), the prospects of the field (68%), and the potential for high income from work in this field (55%). The most popular IT areas among the surveyed applicants were artificial intelligence (67%), data analysis (55%), IT infrastructure development (44%), information systems development (41%), and systems programming (37%).

    More than half of future IT specialists (52%) are convinced that basic knowledge in AI is necessary regardless of their training profile. Among specific AI specialties, applicants named machine learning (67%), generative AI (55%), computer vision (42%), intelligent systems (35%) and natural language processing (34%) as the most interesting. The list of in-demand specializations among applicants also included recommender systems (29%) and ethical aspects and regulation of AI (17%).

    The survey was conducted in June 2025 among 1,300 applicants planning to enroll in IT programs.

    Please note: This information is raw content directly from the source of the information. It is exactly what the source states and does not reflect the position of MIL-OSI or its clients.

    MIL OSI Russia News