MIL-OSI USA: Bowman, Brief Remarks on the Economy and Accountability in Supervision, Applications, and Regulation

    Source: US Federal Reserve

    Thank you for the invitation to join you here in Phoenix at the ABA’s Conference for Community Bankers.1 For the past seven years, this conference has provided an excellent forum for me to meet and interact with bankers and with a range of state and federal regulators, policymakers, service providers, and other stakeholders. Today I would like to share a brief update on my views on monetary policy and the economy before turning to bank regulatory issues and describing how I think regulators should approach the important work of “maintenance” of the regulatory framework.
    Economic Outlook and Monetary Policy
    Toward the end of last year, the Federal Open Market Committee (FOMC) began the process of moving the target range for the federal funds rate to a more neutral setting to reflect the progress made since 2023 on lowering inflation and cooling the labor market. At our September meeting, the FOMC voted to lower the target range by 50 basis points to 4-3/4 to 5 percent, the first cut since we began tightening monetary policy to combat inflation.
    You may remember that I dissented from that decision, the first time a Fed Governor dissented from an FOMC rate decision in nearly 20 years. I preferred a smaller initial cut to begin the policy recalibration phase. I explained my reasoning in a statement published after the meeting noting that the strong economy and a healthy labor market did not warrant a larger cut. In addition, moving the policy rate down too quickly could unnecessarily risk stoking demand, potentially reigniting inflationary pressures, and could be interpreted as a premature “declaration of victory” on our price-stability mandate.
    At the most recent FOMC meeting last month, my colleagues and I voted to hold the federal funds rate target range at 4-1/4 to 4‑1/2 percent and to continue to reduce the Federal Reserve’s securities holdings. I supported this action because, after recalibrating the policy rate by 100 basis points through the December meeting, I think that policy is now in a good place, allowing the Committee to be patient and pay closer attention to the inflation data as it evolves.
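    The recalibration arithmetic above is straightforward to verify. The sketch below assumes a starting range of 5-1/4 to 5-1/2 percent before September and two later 25-basis-point steps; the speech itself states only the 50-basis-point September cut, the 100-basis-point cumulative total through December, and the resulting end points.

```python
# Illustrative check of the target-range arithmetic described in the speech.
# The pre-September range and the sizes of the two later cuts are assumptions
# consistent with the stated figures (a 50 bp September cut and 100 bp of
# cumulative recalibration through December).

def apply_cut(target_range, cut_bps):
    """Lower both ends of a (low, high) percent range by cut_bps basis points."""
    low, high = target_range
    return (low - cut_bps / 100, high - cut_bps / 100)

target = (5.25, 5.50)              # assumed range before the September meeting
target = apply_cut(target, 50)     # September: 4-3/4 to 5 percent
for cut in (25, 25):               # assumed November and December steps
    target = apply_cut(target, cut)

print(target)  # (4.25, 4.5), the 4-1/4 to 4-1/2 percent range held in January
```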
    In my view, the current policy stance also provides the opportunity to review further indicators of economic activity and to gain further clarity on the administration’s policies and their effects on the economy. It will be very important to develop a better sense of these policies and how they will be implemented, and to establish greater confidence about how the economy will respond, in the coming weeks and months.
    For now, the U.S. economy remains strong, with solid growth in economic activity and a labor market near full employment. Core inflation is still somewhat elevated, but has appeared to resume its downward path, and my baseline expectation has been that it will moderate further this year. Even with this outlook, there are upside risks to my baseline expectation for the inflation path.
    In 2023, the rate of inflation declined significantly, but it has taken longer to see further meaningful declines since that time. The latest consumer and producer price index reports suggest that the 12-month measure of core personal consumption expenditures inflation—which excludes food and energy prices—likely moved down to around 2.6 percent in January, which would represent a noticeable stepdown from its 2.8 percent reading in December and 3.0 percent at the end of 2023. Progress had been especially slow and uneven since the spring of last year mostly due to rising core goods price inflation.
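    The 12-month readings cited above are simply percent changes of the core PCE price index from the same month a year earlier. A minimal sketch of the computation, using hypothetical index levels chosen only to reproduce the cited January figure:

```python
# 12-month inflation as a percent change in a price index.
# Index levels below are hypothetical, scaled to reproduce the ~2.6 percent
# January core PCE reading cited in the speech.

def twelve_month_inflation(index_now, index_year_ago):
    """Percent change in a price index over the trailing 12 months."""
    return (index_now / index_year_ago - 1.0) * 100.0

core_pce_prior_year = 100.0   # hypothetical index level a year earlier
core_pce_current = 102.6      # hypothetical current index level

print(round(twelve_month_inflation(core_pce_current, core_pce_prior_year), 1))  # 2.6
```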
    After increasing at a solid pace, on average, over the first nine months of last year, gross domestic product appears to have risen a bit more moderately in the fourth quarter, reflecting a large drop in the volatile category of inventory investment. In contrast, private domestic final purchases, which provide a better signal about underlying growth in economic activity, maintained their strong momentum from earlier in the year, as personal consumption rose robustly again in the fourth quarter. Following strong readings in December, retail sales and sales of motor vehicles softened in January. However, these data can be noisy around this time of year, and sales were likely affected by the cold and wintery weather last month.
    Payroll employment gains have picked up since the summer of last year and averaged a strong pace of about 240,000 per month over the past three months, with last month’s gains likely held back by the Los Angeles wildfires and the harsh winter weather. The unemployment rate edged down further to 4.0 percent in January and has moved sideways since the middle of last year, remaining below my estimate of full employment.
    The labor market appears to have stabilized in the second half of last year, after it loosened from extremely tight conditions. The rise in the unemployment rate since mid-2023 largely reflects weaker hiring, as job seekers entering or re-entering the labor force are taking longer to find work, while layoffs have remained low. The ratio of job vacancies to unemployed workers has remained close to the pre-pandemic level in recent months, and there are still more available jobs than available workers. The labor market no longer appears to be especially tight, but wage growth remains somewhat above the pace consistent with our inflation goal.
    The recent revision of the Bureau of Labor Statistics labor data further vindicates my view that the labor market was not weakening in a concerning way during the summer of last year. Although payroll employment gains were revised down considerably in the 12 months through March 2024, job gains were little revised, on net, over the remainder of last year. It is crucial that U.S. official data more accurately capture structural changes in labor markets in real time, so we can confidently rely on these data for monetary and economic policymaking. But in the meantime, given conflicting economic signals, measurement challenges, and significant data revisions in recent years, I remain cautious about taking signal from only a limited set of real-time data releases.
    Assuming the economy evolves as I expect, I think that inflation will slow further this year. As the inflation data since the spring of last year show, its progress may be bumpy and uneven, and progress on disinflation may take longer than we would hope. I continue to see greater risks to price stability, especially while the labor market remains strong.
    With encouraging signs that geopolitical tensions may be abating in the Middle East, Eastern Europe, and Asia, I will be monitoring global supply chains, which could remain susceptible to disruptions with inflationary effects on food, energy, and other commodity markets. In addition, the release of pent-up demand following the election could lead to stronger economic activity, which could also influence inflationary pressures.
    Now that we have entered a new phase in the process of moving the federal funds rate toward a more neutral policy stance, a few considerations lead me to prefer a cautious and gradual approach to adjusting policy, as it provides us time to assess progress in achieving our inflation and employment goals.
    Given the current policy stance, I think that easier financial conditions from higher equity prices over the past year may have slowed progress on disinflation. And I am watching the increase in longer-term Treasury yields that has occurred since the start of policy recalibration at the September meeting. Some have interpreted it as a reflection of investors’ concerns about inflation risks and the possibility of tighter-than-expected policy that may be required to address inflationary pressures.
    There is still more work to be done to bring inflation closer to our 2 percent goal. I would like to gain greater confidence that progress in lowering inflation will continue as we consider making further adjustments to the target range. We need to keep inflation in focus while the labor market appears to be in balance and the unemployment rate remains at historically low levels. Before our March meeting, we will have received one additional month of inflation and employment data.
    Looking forward, it is important to note that monetary policy is not on a preset course. At each FOMC meeting, my colleagues and I will make our decisions based on the incoming data and the implications for and risks to the outlook and guided by the Fed’s dual-mandate goals of maximum employment and stable prices. I will also continue to meet with a broad range of contacts to help me interpret the signals provided by real-time data and as I assess the appropriateness of our monetary policy stance.
    Bringing inflation in line with our price stability goal is essential for sustaining a healthy labor market and fostering an economy that works for everyone in the longer run.
    Maintenance of the Regulatory Framework
    I will now turn to bank supervision, the bank applications process, and regulation. Community banks experience the burden of the regulatory framework most acutely when it is not appropriately tailored to their size, risk, complexity, and business model. While promoting safety and soundness in the banking system—particularly among community banks—is an important and necessary regulatory objective, we must also be cautious to ensure that the framework does not become an impediment to their operations, preventing them from providing competitive products and services, innovating, and engaging in appropriate risk-taking.
    During my tenure at the Board, I have laid out a wide range of issues and concerns that I see as critical components that are necessary to build and maintain an effective regulatory framework.2 While I will only address a subset of these issues today, I’d like to begin by clarifying what I mean by this.
    Our work to maintain an effective framework is never really complete. Just as complacency can be fatal to the business of a bank, complacency can also prevent regulators from meeting their statutory obligation to promote a safe and sound banking system that enables banks to serve their customers effectively and efficiently.
    System maintenance is not something that we should shy away from. In our everyday lives, we invest significant time in maintenance. We schedule regular oil changes for our cars, and we invest in the infrastructure that allows our economy to function. Devoting resources to maintenance often prevents more costly issues down the road—it’s easier to get oil changes than it is to rebuild an engine.
    So, what does maintenance look like in practice? To address this question, I think it’s helpful to look at three core areas in the bank regulatory framework: supervision, bank applications, and regulation.
    Approach to Supervision
    Let’s start with supervision. Supervision operates almost entirely outside of the public view. Much of the work involves the review of proprietary business information from banks and the preparation of examination reports shielded from public scrutiny in the name of protecting confidential supervisory information. But confidentiality should not be used to prevent scrutiny and accountability in the assignment of ratings.
    So, today, I am going to dig a bit deeper into the realm of supervision to discuss supervisory ratings, accountability, and the troubling trend of inaction and opacity within the supervisory toolkit.
    Rational Standards & Ratings
    While there is some public disclosure of supervisory information, it is often difficult to get a true understanding of supervision based on the data that may be released. In fact, this data often sends confusing and conflicting signals. For example, the Board’s Supervision and Regulation Report presented information stating that only one-third of large financial institutions maintained satisfactory ratings across all relevant ratings components in the first half of 2024.3 At the same time, this report noted that most large financial institutions met supervisory expectations with respect to capital and liquidity.4
    The odd mismatch between financial condition and overall supervisory condition, as assessed by the prudential regulators, raises a more significant issue: whether examiner judgment—evaluations based on subjective, examiner-driven, non-financial concerns—is driving a firm’s overall rating. Are ratings trends based on the materiality of the identified issues, or do they imply that the regulators see widespread fragility in the banking system?
    While this example highlights a large bank ratings framework issue, it is symptomatic of a broader issue that warrants scrutiny—whether the approach to supervision has led to a world in which core financial risks have been de-prioritized, and non-core and non-financial risks—things like IT, operational risk, management, risk management, internal controls, and governance—have been over-emphasized. These issues are important, and certainly worthwhile topics for examiners to consider, but their review should not come at the expense of more material financial risk considerations—and they should not drive the overall assessment of a firm’s condition. There is evidence that supervision has undergone such a shift, not only among large banks, but among regional and community banks as well.5 For all institutions, financial metrics are not among the primary findings determined from the examination process, and arguably they have been de-emphasized when assigning supervisory ratings.
    Prioritization is valuable in the supervisory process, both to inform how examiners allocate their time and to help banks allocate resources to remediate issues identified during the supervisory process. The frequency of supervisory findings related to non-financial metrics may be a byproduct of how long it takes to remediate these issues, like longstanding issues with IT systems that have not been enhanced over many years of growth. However, we should also be vigilant and deliberate about any shift in supervisory focus from financial risk toward non-financial risks and internal processes, as this shift is not focused on fundamental safety and soundness issues, and it is not cost-free.
    We should also not expect every firm to coalesce around a single set of products, internal processes, and risk-management practices. Variety in banking models is a strength and a necessity of the U.S. banking system, relying on management and boards of directors to determine bank strategy, rather than a bank’s business model effectively being set by supervisory directives.
    Supervisory practices like horizontal reviews can create examiner incentives to expect uniformity and “grade on a curve,” but this approach perversely punishes variation among bank practices, stifling competition and innovation. Supervisory findings also inform bank ratings, which can have follow-on effects like limiting options for mergers and acquisitions (M&A); raising the cost of liquidity; or diverting resources away from other, more important bank management priorities.
    Diagnostic Accountability
    To maintain strong and appropriate supervisory standards and practices, we need to take a step back and diagnose the bank regulatory system in its entirety: what is working, what is broken, and what needs to be updated. When things go wrong, having an impartial check on subjective judgments can lead to a better diagnosis. Of course, a better diagnosis can produce more efficient and targeted improvements, and better promote accountability. Accountability is critical to maintaining an effective regulatory system, and yet it can be difficult to establish a regulatory culture that includes mechanisms to promote accountability for supervisors and regulators.6
    At every organizational level, from examiners to agency leadership, judgments are made that contribute to the overall effectiveness of the supervisory process. Reserve Bank examiners play a critical role in examining Fed-regulated institutions, both banks and holding companies. The Federal Reserve exercises its supervisory responsibilities by supervisory portfolio, with each portfolio relying on a combination of Board and Reserve Bank staff.7 But this split allocation of responsibility should not diminish the accountability for supervisory decision making. Responsibility for supervisory decisions must be coupled with accountability for these decisions. The misalignment of responsibility and accountability limits our ability to conduct effective supervision.
    This division of responsibility can pose a challenge to accountability. In the aftermath of the bank failures in 2023 and the broader stress to the banking system, the Board and other agencies proposed a variety of regulatory reform measures to remediate and address identified issues, based on internal reviews of the failures and banking stress. While I applaud efforts to hold ourselves accountable, we must ensure that self-reviews are credible, both in the causes they identify and in the reform agenda that they are used to support. An internal review process poses the temptation to avoid responsibility by assigning blame elsewhere, even when the review may be motivated by good intentions and with the outward appearance of impartiality.
    Many of the core problems in the lead-up to the bank failures involved well-known, core banking risks—interest rate risk, liquidity risk, and poor risk management. But if we look at the subsequent reform agenda, we see that the policy emphasis has been on broader regulatory changes rather than addressing supervisory program deficiencies. In my mind, this highlights the need to have a process that challenges the subjective judgments of those that were involved in oversight, not only in performing the diagnostics, but evaluating how identified issues can best be remediated.
    Purging Inaction and Opacity from the Supervisory Toolkit
    Supervision differs significantly from the regulatory process. Implementing new regulations, or amending existing ones, requires a public notice and comment process established by the Administrative Procedure Act. When done appropriately, regulation requires regulators to “show their work”: providing extensive analytical and factual support for proposals and final rules, soliciting comment from the public, and addressing those comments before finalizing a regulation. In contrast, the execution of bank examinations and the issuance of supervisory guidance lack these procedural safeguards. They instead rely heavily on discretion and judgment, with far lower standards for supporting actions with facts and analysis, all under the veil of confidential supervisory information. The greater flexibility afforded in the supervisory process can lead to poor outcomes, often caused by the temptation to use inaction and opacity as supervisory tools. In my view, these tools, inaction and opacity, are not appropriate and must be subject to appropriate scrutiny or purged from the toolkit altogether.
    First, let’s consider inaction. The exam process requires open communication between examiners and banks. Interpretive questions often arise during the exam process: how do existing rules and statutes apply in a particular circumstance? These questions arise when existing rules and guidance are unclear, which is a frequent occurrence. For example, how can a bank operate in a safe and sound manner while offering a new product or service, or when serving customers in particular business lines with unique needs? Banks go to great effort to meet all applicable requirements and regulatory expectations, and regulators should welcome banks seeking supervisory input and relying on a compliance-focused mindset.
    Open communication with regulated banks is a hallmark of good supervision, but regulators must live up to their end of the bargain by not leaving banks in “limbo” for extended periods of time. When a bank requests feedback and engages in good faith to provide information and respond to reasonable questions, regulators have an obligation to provide a clear response. Banks should not be left to wonder whether an interpretation of existing laws, regulations, and guidance is consistent with the understanding of regulators.
    Next, let’s consider opacity. Questions raised in the supervisory channel often result from supervisory expectations that lack sufficient clarity or the application of rules and regulations to new and emerging products and services. While regulators should not form an opinion without understanding the relevant facts and circumstances, they must also strive to provide clarity—not just to the bank being examined, but to all banks. Supervisory expectations should not surprise regulated firms, and yet transparency around expectations is often challenging to achieve.8
    The problem of opacity in supervisory expectations is exacerbated by the umbrella of confidential supervisory information, or CSI, which is the label given to most materials developed in the examination process. The rules designed to protect CSI limit the public’s visibility into shifting priorities and expectations in the supervisory process.9 Changes in supervisory expectations frequently come without the benefit of guidance, advance notice, or published rulemaking. In the worst-case scenario these shifts, cloaked by the veil of supervisory opacity, can have significant financial and reputational impacts or can disrupt the management and operations of affected banks.
    Opacity in supervisory expectations, or in the interpretation of applicable laws and regulations, should not be discovered only at the conclusion of an examination with the issuance of deficiencies, matters requiring attention, matters requiring immediate attention, or other shortcomings.
    Approach to Applications
    Sunshine is the best disinfectant when it comes to an approach that fosters transparency and accountability. So, I would like to spend a few minutes discussing how we can better shine a light into the dark corners of the bank applications process.
    De Novo Formation
    De novo formation has essentially stagnated over the past several years. While many factors have contributed to the decline in the aggregate number of banks in the United States, one key factor has been the lack of new bank formation to replace banks that have been acquired or closed their doors. This lack of de novo bank approvals does not necessarily indicate a lack of demand for new charters, though, particularly in light of ongoing demand for bank “charter strip” acquisitions, in which banks are acquired just for their charters; the growing demand for banking-as-a-service partnerships; and the shift of activities out of the banking sector into the non-bank financial system.10 We should consider whether the applications process itself has become an unnecessary impediment to de novo formation.
    How can we improve the process of de novo formation? As fewer applications come in, institutional muscle memory for how to deal with new bank charters erodes, and it becomes difficult to navigate and ultimately to overcome institutional inertia. A few steps like developing specialized expertise, streamlining the application process, and improving transparency can yield significant improvements.
    First, de novo formations are very different from other bank applications, where there are existing institutions with established supervisory ratings and examination records. A de novo applicant has no supervisory record of performance on which to base a decision or inform judgments about whether an application is consistent with approval. Instead, regulators must evaluate the proposal based on applicable statutory requirements: Is the business plan and strategy sound and viable? Is appropriate bank leadership in place? Is the bank’s proposal supported by sufficient capital? And should there be an expectation that all of these questions are answered exhaustively, often well over a year before the bank would be formed, if it is approved?
    In recent years de novo formations have been rare, and therefore staff tasked with evaluating these proposals do not have a recent perspective or deep well of experience from which to draw. Under our current approach, regional Reserve Banks are the primary point of contact for de novo applicants. We should consider creating a specialized resource that any Reserve Bank can draw on during pre-filing conversations with de novo applicants. Our goal should be to facilitate new bank creation—identifying and finding achievable pathways to yes, instead of ratcheting requirements up to unachievable levels or imposing requirements intended to deter applicants from filing or moving forward.
    We should also consider whether there are ways to streamline the application process, including, if needed, by recommending statutory changes. While the agencies use some common forms, de novo formations currently involve a range of regulatory approvals. A de novo applicant must apply for a bank charter from the Office of the Comptroller of the Currency or a state banking authority, deposit insurance from the Federal Deposit Insurance Corporation, and potentially membership or a parallel holding company formation application with the Federal Reserve.
    Each regulator may be focused on different aspects of the application, and each has the right to ask for additional information as part of the application review and analysis, potentially extending the review timeframe significantly. We should have clear standards of review and approval—and coordinated actions—among the state and federal regulators involved in any application. This should include clear timelines for the point at which a regulator forfeits its opportunity to object due to inaction, delay, or stalling tactics.
    But standards for de novo approval are not always clear to applicants, which can lead to lengthy back-and-forth discussions with banking agency staff even after an applicant has prepared the information required by the appropriate application forms. The need for extensive additional information from de novo applicants can be caused by a failure to provide information requested in the application form, but I suspect the submission of incomplete information is often a product of forms that do not ask for all of the information regulators actually need.
    We should not need to constantly supplement application forms with ad hoc information requests. If additional information is needed, we should modify the required application forms. One area where the lack of transparent and clear standards is most evident is with the amount of capital required to establish a de novo bank. Discussions around required capital often hinge on subjective assessments based on planned business model and growth, but they rarely involve regulators providing a minimum required capital amount. Standards for approval should not be shrouded in mystery.
    Reform of the de novo applications process should not be thought of as a deregulatory exercise. Clear and transparent standards do not imply “low” or inadequate standards. At the same time, if we want to encourage a pipeline of de novo bank formations, we should also be comfortable with the uncertainty that accompanies any new business, including the risk that some de novo banks will not succeed.
    The cost of eliminating the failure of de novo banks—or really of any banks at any time—is simply too great. Banking is fundamentally about appropriately managed risks, and regulators play a key role in promoting a system that is safe and sound while also serving to support the banking needs of customers and broader economic growth. Our goal should not be to create a banking system that is safe, sound, and ultimately irrelevant.
    Mergers and Acquisitions
    The issues with the banking applications process extend beyond de novo formations but involve some of the same concerns: whether there are clear standards and whether we are able to act in a timely manner. As a threshold matter, if regulators are clear about the information they need to process an application—for example, by updating application forms to include the full set of information needed to analyze each statutory approval requirement—then we should also hold ourselves to fixed approval timelines. In my view, the purgatory of a long application process is another form of regulatory “inaction” that must be eliminated.
    We should also address aspects of the applications process that contribute to delay, including both the approach to competition and the public comment process.
    The banking agencies have long relied on competitive “screens” to evaluate the pro forma effect of a merger. This process looks at the standalone institutions, imagines a merger in which their operations are combined, and then looks at how measures of competition will change in the areas served by the merged institutions. Where there is overlap in markets served, there is the potential for tripping competitive screens and triggering additional scrutiny. At the Federal Reserve, when a competitive screen is triggered the application process takes more time, as staff reviews the conflict, and the matter is removed from the Reserve Bank-delegated processing track.
    Perversely, many banks that trigger additional scrutiny operate in rural markets and have less aggregate banking business over which institutions can compete. In these concentrated markets, the analytical approach may involve a counterfactual in which only two future states of the world exist: the banks continue to operate on a standalone basis, or the banks merge and operate as a consolidated whole. However, this framing ignores a possible third option: that one or both of the institutions will cease being viable and shut their doors, or be acquired by a credit union, similarly leading to an erosion of market competition and potentially greater disruption to the communities served. This analytical approach to evaluating competition is no longer appropriate, and it needs to be reformed to better reflect actual market realities. This must include competition from credit unions, the farm credit system, internet banks, financial technology firms, and other non-banks.
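    Competitive screens of this kind are typically built on the Herfindahl-Hirschman Index (HHI), the sum of squared market shares in a banking market. A minimal sketch of such a screen, assuming the thresholds long used for bank mergers (post-merger HHI above 1800 with an increase of more than 200 points) and entirely hypothetical deposit shares in a single concentrated rural market:

```python
# Sketch of an HHI-based competitive screen for a hypothetical bank merger.
# Shares are hypothetical deposit-market shares (in percent); the 1800/200
# thresholds are an assumption reflecting the screen traditionally applied
# to bank mergers, not a statement of the agencies' current rules.

def hhi(shares):
    """Herfindahl-Hirschman Index: sum of squared percent market shares."""
    return sum(s * s for s in shares)

def trips_screen(shares, merging, post_threshold=1800, delta_threshold=200):
    """Return True if merging the named banks trips the screen."""
    pre = hhi(shares.values())
    post_shares = {name: s for name, s in shares.items() if name not in merging}
    post_shares["combined"] = sum(shares[name] for name in merging)
    post = hhi(post_shares.values())
    return post > post_threshold and (post - pre) > delta_threshold

market = {"Bank A": 35.0, "Bank B": 25.0, "Bank C": 20.0, "Other lenders": 20.0}
print(trips_screen(market, {"Bank A", "Bank B"}))  # True: HHI rises 2650 -> 4400
```

    Note that the screen above counts only the banks listed in the market; the reform argued for in the text would require broadening that set to include credit unions, internet banks, and other non-bank competitors.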
    Finally, many M&A applications come to the Board due to the receipt of an adverse comment from the public about the past supervisory record of one or both of the institutions involved in a merger. The receipt of an adverse comment causes substantial delays in the processing of an application, as this too removes an application from “delegated” processing by the local Federal Reserve Bank, escalating the matter to the Board of Governors in D.C. While it is important that regulators take public feedback into account—and indeed, this is required by applicable law—we should also be concerned about comments that lack factual support, that rely solely on matters always considered in the review of a proposal, such as the existing supervisory records of the acquirer and the target institution, or that may be negated by the regulator’s own examination report.
    Approach to Regulation – Cleanup and the Statutory Regulatory Review

    Since the passage of the Dodd-Frank Act nearly 15 years ago, the body of regulations to which all banks are subject has increased dramatically. Many of the reforms made after the 2008 financial crisis were important and essential to ensuring a stronger and more resilient banking system. Yet a number of the changes are backward looking, responding only to that mortgage crisis without fully considering potential unintended consequences or future states of the world.
    With well over a decade of post-implementation experience now behind us, it is time to evaluate whether all of these changes remain relevant. Some of the regulations put in place immediately after that financial crisis pushed foundational banking activities out of the banking system and into less regulated corners of the financial system. We need to ask whether this is appropriate. These tradeoffs are complicated, and we must consider not only the changes that were made but also how the banking system has evolved and how it differs today.
    Driving all risk out of the banking system is at odds with the fundamental nature of the business of banking. Banks, after all, are businesses, and they must be able to earn a profit and grow while also managing their risks. New requirements that impose additional costs must be weighed against whether they strike the right tradeoff between safety and soundness and enabling banks to serve their customers and run their businesses. The task of policymakers and regulators is not to eliminate risk from the banking system, but to ensure that risk is appropriately and effectively managed.
    In a well-functioning and appropriately regulated banking system, banks serve an indispensable role in credit provision and economic stability. The goal is to create and maintain a system that supports safe and sound banking practices, and results in the implementation of appropriate risk management. No efficient banking system can eliminate all bank failures. But well-designed and well-maintained systems can limit bank failures and mitigate the harm caused by any that occur.
    Maintenance of the regulatory framework is necessary to ensure that our regulations continue to strike the right balance between encouraging growth and innovation, and safety and soundness. One easily identifiable way to achieve this is through the Economic Growth and Regulatory Paperwork Reduction Act (EGRPRA) review process, which the agencies initiated in February of last year.
    The EGRPRA review requires the federal banking agencies to identify outdated, unnecessary, or overly burdensome regulations and to eliminate them or otherwise address the burdens they impose. Prior iterations of the EGRPRA process have been underwhelming in their ability to produce meaningful change, but it is my expectation that this review, and eventually the accompanying report to Congress, will provide a meaningful avenue for stakeholders and the public to engage with the banking agencies in identifying regulations that are no longer necessary or are overly burdensome. It is also my expectation that regulators will be responsive to the concerns raised.
    Another area ripe for review is the set of Board rules that address core banking issues, from loans to insiders, to transactions with affiliates, to state member bank activities and holding company requirements. Many of the Board’s regulations have not been comprehensively reviewed or updated in more than 20 years. Given the dynamic nature of the banking system and how the economy and the banking and financial services industries have evolved over that period, it is imperative that we update and simplify many of the Board’s regulations, including their applicability thresholds and benchmarks.
    Finally, I want to address the unintended consequences of anti-money laundering requirements for the provision of banking services. I think we can agree that fighting money laundering, terrorist financing, and other illicit activities is not only a statutory responsibility of the banking system but also serves important public policy goals. And while the regulatory framework prescribing how banks fulfill this role is not within the Federal Reserve’s responsibilities, it is important to consider how these requirements affect the ability of banks to serve customers. For example, the threshold for currency transaction reports (CTRs) was established more than 50 years ago and has never been updated or indexed to inflation. At the time it was implemented, a fully loaded Cadillac cost less than the CTR threshold. We’ve come a long way since 1972.
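    The scale of that drift is easy to estimate. The $10,000 CTR threshold dates from the Bank Secrecy Act era; the CPI index levels below are approximate annual averages used purely for illustration (exact figures come from BLS data), and the result is a back-of-the-envelope sketch, not an official calculation:

    ```python
    # Back-of-the-envelope inflation adjustment of the $10,000 CTR threshold.
    # CPI-U levels are approximate annual averages (assumed for illustration).
    CTR_THRESHOLD_1972 = 10_000
    CPI_1972 = 41.8   # approximate CPI-U annual average, 1972
    CPI_2024 = 313.7  # approximate CPI-U annual average, 2024

    adjusted = CTR_THRESHOLD_1972 * CPI_2024 / CPI_1972
    print(f"${adjusted:,.0f}")  # roughly $75,000 in 2024 dollars
    ```

    In other words, a threshold calibrated to a large, unusual cash transaction in 1972 now captures transactions worth well under a seventh of that original purchasing power.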
    This has also created a regime of more extensive and invasive reporting on customers’ transactions that may pose little actual risk of illicit activity. The reporting regime is not cost-free: banks may opt to avoid customers that trigger high volumes of CTR filings or that otherwise trigger suspicious activity reports. The calibration of reporting requirements, their effect on bank customers, and the growing problem of customer “de-banking” warrant greater public attention.
    The Federal Reserve should review the supervisory messages given to banks and their holding companies about how supervisors will evaluate and consider the bank’s risks associated with customers that are caught in the Bank Secrecy Act or Anti-Money Laundering reporting web. I am concerned that this framework is being used to downgrade a bank’s condition based on a disproportionate weighting of its compliance with these requirements in comparison to its overall condition. There are separate examinations conducted for this purpose, and they should be viewed separately, not as a cudgel for downgrading a bank’s condition through the governance and controls mechanism or management assessment.
    Closing Thoughts

    The banking system can be an engine of economic growth and opportunity, particularly when it is supported by a bank regulatory framework that is rational and well-maintained. The work of rationalizing and maintaining this system is an ongoing cycle. While my remarks today have touched on a wide range of issues that require rationalization and “maintenance,” this is by no means an exhaustive list.
    Maintaining an effective framework is not only about ensuring the existing plumbing continues to work (and making it more efficient where possible) but it also must include promoting a system that is responsive to emerging threats and the needs of the banking system. As an example, the significant increase in fraud over the past few years has not generated the strong regulatory and governmental response necessary, even though fraud can become a source of material financial risk, particularly to smaller institutions.
    Thank you again for the opportunity to share my thoughts with you today. As always, it is a pleasure to be with you!

    1. The views expressed in these remarks are my own and do not necessarily reflect those of my colleagues on the Board of Governors of the Federal Reserve System or the Federal Open Market Committee.
    2. See, e.g., Michelle W. Bowman, “Bank Regulation in 2025 and Beyond” (speech at the Kansas Bankers Association Government Relations Conference, Topeka, Kansas, February 5, 2025); Michelle W. Bowman, “Approaching Policymaking Pragmatically” (speech at the Forum Club of the Palm Beaches, West Palm Beach, Florida, November 20, 2024); Michelle W. Bowman, “Building a Community Banking Framework for the Future” (speech at the 2024 Community Banking Research Conference, St. Louis, Missouri, October 2, 2024); Michelle W. Bowman, “The Future of Stress Testing and the Stress Capital Buffer Framework” (speech at the Executive Council of the Banking Law Section of the Federal Bar Association, Washington, D.C., September 10, 2024); Michelle W. Bowman, “Liquidity, Supervision, and Regulatory Reform” (speech at “Exploring Conventional Bank Funding Regimes in an Unconventional World,” Dallas, Texas, July 18, 2024); Michelle W. Bowman, “The Consequences of Bank Capital Reform” (speech to the ISDA Board of Directors, London, England, June 26, 2024); Michelle W. Bowman, “Innovation in the Financial System” (speech at the Salzburg Global Seminar on Financial Technology Innovation, Social Impact, and Regulation: Do We Need New Paradigms?, Salzburg, Austria, June 17, 2024); Michelle W. Bowman, “Bank Mergers and Acquisitions, and De Novo Bank Formation: Implications for the Future of the Banking System” (remarks at A Workshop on the Future of Banking, Kansas City, Missouri, April 2, 2024); Michelle W. Bowman, “Tailoring, Fidelity to the Rule of Law, and Unintended Consequences” (speech at the Harvard Law School Faculty Club, Cambridge, Massachusetts, March 5, 2024); Michelle W. Bowman, “The Role of Research, Data, and Analysis in Banking Reforms” (speech at the 2023 Community Banking Research Conference, St. Louis, Missouri, October 4, 2023).
    3. See Board of Governors of the Federal Reserve System, Supervision and Regulation Report at 16-17 (Washington: Board of Governors, November 2024) (describing data for the first half of 2024, the most recent period for which data are available).
    4. Board of Governors of the Federal Reserve System, Supervision and Regulation Report.
    5. Board of Governors of the Federal Reserve System, Supervision and Regulation Report at 17, 20.
    6. See Michelle W. Bowman, “Accountability for Banks, Accountability for Regulators” (essay published in Starling Insights, February 13, 2024).
    7. “Understanding Federal Reserve Supervision,” Board of Governors of the Federal Reserve System, last modified April 27, 2023.
    8. See Michelle W. Bowman, “Approaching Policymaking Pragmatically” (speech at the Forum Club of the Palm Beaches, West Palm Beach, Florida, November 20, 2024).
    9. See Michelle W. Bowman, “Reflections on the Economy and Bank Regulation” (speech at the New Jersey Bankers Association Annual Economic Leadership Forum, Somerset, New Jersey, March 7, 2024).
    10. See Michelle W. Bowman, “The Consequences of Fewer Banks in the U.S. Banking System” (speech at the Wharton Financial Regulation Conference, Philadelphia, Pennsylvania, April 14, 2023).

    MIL OSI USA News –

    February 18, 2025
  • MIL-OSI United Kingdom: Response to international conflict shaped by University Assembly

    Source: University of Aberdeen

    Professor Paul Gready, Claire Hajaj, Dr Rebekah Widdowfield and Professor Jo-Anne Murray at the Aberdeen University Assembly on International Conflict

    How the University of Aberdeen should respond to international conflict was the subject of in-depth debate at a groundbreaking event on campus last week.
    A University Assembly was held on Friday, 14 February which saw more than 30 delegates, comprising both students and staff, discuss possible University responses to international conflict.
    The Assembly, held at King’s Pavilion, was announced last year, following discussions in Senate around conflict issues and the encampment on Elphinstone Lawn, to seek input and guidance from students and staff on this challenging issue facing the University and our community.
    During the half-day event, which was hosted by Professor Jo-Anne Murray, Vice-Principal (Education), delegates heard from speakers Claire Hajaj, a specialist in conflict and post-conflict dynamics, and Professor Paul Gready, Co-Director of the Centre for Applied Human Rights (CAHR) at the University of York.
    Dr Rebekah Widdowfield, Vice-Principal for People & Diversity at the University of St Andrews, facilitated a broad-ranging discussion for delegates in the final session.
    Professor Jo-Anne Murray commented: “The University Assembly was a very special and positive event which allowed students and staff to express their views on how we can respond to international conflicts and what we can do to address them at a local level.
    “The delegates participated in a constructive way to discuss a very challenging and sensitive topic, sometimes with opposing views but always with the aim of finding common ground and it was pleasing to see the emergence of actions the University can take forward.”


    A report summarising the outcomes of the Assembly, and proposed next steps, will be published shortly, with a review on progress in a year’s time.
    The Assembly format originated in Ireland as a form of participative democracy designed to provide real insight into complex issues. The model has also been applied, including at Aberdeen, in the form of Climate Assemblies. Professor David Farrell, University College Dublin, provided expert guidance in designing the event, based on his experience of delivering and researching the Irish Citizens’ Assembly model. Although he was unable to attend the event, he provided valuable advice to delegates, via a recorded video message, on creating a ‘safe space’ within which views can be shared.
    Nick Edwards, Assembly Co-Lead, Deputy Director of People, said: “International conflicts affect all of us in many ways and social media brings it into our homes in a way that was not possible before.
    “The Assembly format encourages all participants to express their views and help to shape the University’s response. For me, the strength of this approach is allowing members of our community to directly engage in discussions on these important topics, and I hope it is an approach we can refine and use again in the future.”
    A key part of the Assembly was the involvement of students in the design, delivery and support of the event over several months.
    Christina Schmid, Student President, Aberdeen University Students’ Association, said: “The Assembly was an important event, and it was encouraging to see students at the heart of its planning and delivery. We’ve always believed it’s crucial that students’ voices are not just heard but genuinely respected and valued in these discussions—not just as a token gesture.
    “We’re pleased that the University is taking this approach and is open to collaboration, allowing for a lively and meaningful discussion. This event and the next steps will give everyone the opportunity to share their views and have a direct influence on the University’s response to international conflicts.”

    MIL OSI United Kingdom –

    February 18, 2025
  • MIL-OSI United Kingdom: expert reaction to EDX medical press release giving topline findings on a new prostate cancer screening test

    Source: United Kingdom – Science Media Centre

    February 17, 2025

    Scientists comment on a press release from EDX that gives findings on a new screening test for prostate cancer. 

    Prof Derek Rosario, Consultant Urological Surgeon, Honorary Professor and Clinical Advisor (Prostate) to the UK National Screening Committee, said:

    “As far as I can tell from the information in the press release from EDX Medical, there have been no prospective clinical studies of this ‘super test’. The test relies on an algorithm to combine information from around 100 previously ‘validated’ biomarkers in blood and urine. To what extent these biomarkers are feeding in additional information and how the algorithm will work in clinical practice has yet to be determined. The most telling statement to me is … “EDX Medical scientists expect the test to consistently deliver exceptionally high accuracy with levels of sensitivity and specificity of between 96-99% across an extended age-range and diverse ethnic groups. By comparison, current standard of care prostate testing, including prostate specific antigen (PSA) tests and biopsies, can be below 50%. EDX Medical’s scientific team will validate further clinical data over coming months prior to seeking regulatory approval from the Medicines & Healthcare products Regulatory Agency (MHRA) and the US Food and Drug Administration (FDA) with a view to launching the test later this year or early 2026.” So, there is an expectation that this test will be effective, but as far as I can tell these claims have not been demonstrated with a clinical study as yet. The test needs to be prospectively validated – I’m not sure whether I have missed the original literature on this, but we need more information than is currently provided by the press release to be able to validate the claims. To what extent this test will outperform something like the Stockholm 3 (a blood test that estimates the risk of prostate cancer in men) remains to be seen. A test that has both a sensitivity and specificity of 96-99% would be truly unusual in clinical practice – usually there is a trade-off to be had between the two, so that statistic does not quite make sense to me, though I would need to see the data underlying these claims to make a final judgement, but it is not yet provided.”
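    One reason headline sensitivity/specificity figures need the scrutiny Prof Rosario describes is that, at low disease prevalence, even an excellent test produces many false positives. The sketch below applies Bayes’ theorem to compute the positive predictive value; the 5% prevalence is an assumed, purely illustrative figure, not a claim about any real screening population:

    ```python
    # Positive predictive value (PPV) of a screening test via Bayes' theorem.
    # Sensitivity/specificity are taken from the press release's lower bound;
    # the prevalence is an assumption for illustration only.
    def ppv(sensitivity, specificity, prevalence):
        """Probability of disease given a positive test result."""
        true_pos = sensitivity * prevalence
        false_pos = (1 - specificity) * (1 - prevalence)
        return true_pos / (true_pos + false_pos)

    print(round(ppv(0.96, 0.96, 0.05), 3))  # → 0.558
    ```

    Under these assumed numbers, nearly half of positive results would be false positives, which is why prospective validation in a realistic screening cohort matters, not just the headline accuracy figures.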

    Professor Ros Eeles, Professor of Oncogenetics at The Institute of Cancer Research, London, and Consultant in Clinical Oncology and Cancer Genetics at The Royal Marsden NHS Foundation Trust, said:

    “The development of new biomarker tests for early detection of prostate cancer is an important area of research to increase the number of prostate cancer cases found at an earlier stage and to prevent deaths from prostate cancer.

    “However, it is very important to show that any new biomarker tests do indeed improve earlier diagnosis and such tests need trials to determine this. While the biomarkers used in this test have been validated, this particular combination of markers has not yet been shown to detect cancer at an earlier stage and prevent deaths.

    “The TRANSFORM trial – led by six researchers including myself – will assess several approaches to earlier detection of prostate cancer in hundreds of thousands of men, including genetic risk stratification, imaging techniques and biomarkers.”

    Prof Freddie Hamdy, Nuffield Professor of Surgery, Professor of Urology, University of Oxford, said:

    “The fact that there is nothing published on the test does not necessarily mean they have not validated it already. They claim: “Individually, these biomarkers have all been clinically validated and published and in previous trials on more than 31,000 positive prostate cancer samples as well as more than 100,000 control non-cancer samples.” So we have to assume that they have already done this, we just don’t know the data and the nature of the cohorts on which the test was validated, and we don’t know if it has been peer-reviewed and I tried to find published literature but couldn’t. They also admit the test needs further validation.

    “They claim both high sensitivity/specificity AND accurate risk prediction. But increasing the diagnosis of prostate cancer in itself is not a desirable achievement unless it detects ‘important’ disease, i.e. clinically significant, and this is where the problem lies. How did they define ‘risk prediction’? Urologists themselves are revisiting what ‘clinical significant’ prostate cancer means. So for example if the ‘bar’ was the detection of any cancer with Gleason Grade Group 2 as the threshold, it is fraught with problems because it will continue to increase over-diagnosis and over-treatment.”

    Press release: https://edxmedical.co.uk/product/a-new-comprehensive-prostate-cancer-screening-test/

    Declared interests

    No reply to our request for DOIs was received.

    MIL OSI United Kingdom –

    February 18, 2025
  • MIL-OSI: VEEA® and VAPOR IO Announce a Strategic Partnership to Provide Turnkey AI-as-a-Service Pioneering Solutions for AI Inferencing, Federated Learning, Agentic AI and AIoT

    Source: GlobeNewswire (MIL-OSI)

    Visit us at Mobile World Congress in Barcelona, Spain, March 3-6, 2025, for demonstrations
    By appointment (marketing@veea.com) in Hall 6, Stand 6A or on M37 Yacht in Port Vell, Barcelona

    NEW YORK, Feb. 17, 2025 (GLOBE NEWSWIRE) — Veea Inc. (NASDAQ: VEEA), a pioneer in hyperconverged heterogeneous Multiaccess Edge Computing (MEC) with AI-driven cybersecurity and edge solutions, and Vapor IO, the leading developer of Zero Gap™ AI for zero-configuration data centers, which enables comprehensive training from a catalog of state-of-the-art models and delivers ultra-low-latency AI inferencing with private 5G networks across distributed edge locations, announced a partnership to offer turnkey AI-as-a-Service (AIaaS) to enterprises, municipalities and others without their having to invest in capital-intensive edge devices, servers, networking equipment and data center facilities.

    For enterprise applications such as Smart Manufacturing, Smart Warehouses, Smart Hospitals, Smart Schools, Smart Construction, Smart Infrastructure, and many others, the Veea Edge Platform™ collects and processes raw data at the Device Edge, where user devices, sensors and machines connect to the network; this matters most for reasons of low latency, data privacy and data sovereignty. VeeaWare® full-stack software, running on VeeaHub® devices and on third-party hardware with GPUs, TPUs or NPUs, such as NVIDIA AGX Orin and Qualcomm Edge AI Box-based hardware on a Veea computing mesh, provides the full gamut of AI inferencing with cloud-native edge applications and AI-driven cybersecurity, with bespoke Agentic AI and AIoT for specific use cases. Combined with its VeeaCloud management functions, AIoT platform, and extension of network slicing through the LAN with SDN and NFV, the Veea Edge Platform offers an unrivaled capability for AI inferencing in enterprise use cases at the edge.

    The core of Vapor IO’s Zero Gap AI is built around Supermicro MGX servers with the NVIDIA GH200 Grace Hopper Superchip for high-performance accelerated computing and AI applications. Zero Gap AI makes it possible to simultaneously deliver AI inferencing and train complex models while supporting private 5G networks, including NVIDIA Aerial-based private 5G network services. In a proof of concept with Supermicro and NVIDIA in Las Vegas, Vapor IO demonstrated how Zero Gap AI customers can receive the benefits of AI inferencing for a range of use cases, including in mobile environments, with the highest level of performance and reliability achievable today. For low-latency use cases, Zero Gap AI is offered as high-performance micro data centers placed in close proximity to where AI inferencing is delivered. The Zero Gap AI offering provides the AI tools, libraries, SDKs, pre-trained models, frameworks and other components that may optionally be employed to develop AI apps.

    “AI represents a new class of software. Just as computing evolved from the client-server architectures to more decentralized models, for most enterprise applications AI will inevitably migrate to the edge sooner rather than later—driven by the need for data sovereignty, real-time processing, lower latency, enhanced security, and greater autonomy. The future of AI is on the edge, where intelligence meets efficiency,” stated Allen Salmasi, co-founder and CEO of Veea. “As the first PCs brought general computing to business customers first, through the partnership with Vapor IO, we intend to accomplish the same by streamlining the application of AI where data is generated at the edge. By integrating scalable computing, storage, hyperconverged networking and AI-driven cybersecurity into a unified system with a cloud-native architecture at Device Edge and VeeaCloud management capabilities together with Vapor IO we have taken much of the uncertainty and friction out of the adoption of AI at the edge.”

    The combined capabilities of the Veea Edge Platform and Zero Gap AI offer a unified, automated platform with orchestration for seamless workload distribution, enabling a new class of collaborative, distributed AI applications as an AI-in-a-Box solution:

    • VeeaCloud management of GPU clusters – Plays a crucial role in balancing performance, scalability, and efficiency for AI inferencing, while utilizing cloud orchestration for resource optimization, model updates, and intelligent workload distribution.
    • Providing On-Demand AI Compute – Eliminates the need for enterprises to invest in costly on-prem AI hardware by offering scalable, GPU-accelerated AI compute at the edge.
    • Enabling AI at Any Scale – Supports AI workloads ranging from lightweight IoT analytics to full-scale deep learning training, ensuring enterprises can adopt AI incrementally or at full scale.
    • Harnessing Agentic AI – Integrates intelligent, autonomous decision-making capabilities that enable AI systems to adapt and optimize their performance in real-time, enhancing the effectiveness of applications across various edge environments.
    • Facilitating Federated Learning – Supports collaborative model training across distributed edge devices while maintaining data privacy, allowing enterprises to leverage insights from decentralized data sources without compromising sensitive information.
    • Supporting Model Hosting & AI Inference – Allows users to deploy, manage, and scale AI models in real-time, with low-latency inference APIs available across edge locations.
    • Offering Bare Metal and Virtualized AI Instances – Users can lease dedicated AI hardware or deploy workloads in multi-tenant GPU/CPU environments, ensuring flexibility for both small and large-scale AI applications.
    • Integrating Edge Storage & AI Data Management – Includes NVMe-based high-speed caching for inference and object storage for large-scale AI datasets, reducing reliance on cloud-based data transfers.
    • Ensuring Seamless Connectivity Options – A range of ultra-low latency connectivity options to optimize AI data transfer between on-prem devices and Edge-to-Edge compute.
    • Reducing AI Deployment Complexity – Automates AI workload orchestration, allowing businesses to expand, migrate, or failover AI models across distributed edge nodes without manual reconfiguration.
    • Accelerating Time-to-Value for AI Deployments – Provides a pre-integrated solution that reduces AI setup time from months to minutes, allowing enterprises to launch AI-powered solutions with minimal friction and on-going maintenance.

    “According to Gartner, 85% of all AI models/projects fail because of poor data quality or little to no relevant data. We have largely addressed this industry pain point most cost-effectively with much reduced complexity and little risk of disappointment through our Edge-to-Edge partnership with Veea,” explained Cole Crawford, Vapor IO’s founder and CEO. “With our substantial ecosystem of major partners and developers, we are well positioned to offer one of the most competitive turnkey real-time AI inferencing capabilities in the market with federated learning, Agentic AI and AIoT to public and private enterprises.”

    About Veea

    Veea Inc. (NASDAQ: VEEA) was formed in 2014 and is headquartered in New York City with a rich history of major innovations in the development of advanced networking, wireless and computing technologies. Veea® has unified computing, communications, edge storage and cybersecurity solutions through fully integrated cloud- and edge-managed products. Veea’s pioneering Multiaccess Edge Computing (MEC) product, developed from the ground up in several compact form factors, brings together the functionality typically provided for through any combination of servers, Network Attached Storage (NAS) devices, routers, firewalls, Wi-Fi APs, IoT gateways, 4G or 5G wireless access, and Cloud Computing by means of multiple hardware, software and systems integrated and maintained by IT/OT professionals.

    Veea Edge Platform™ is a cloud-managed full-stack platform designed to manage multi-vendor heterogeneous devices with a Linux server hosting VeeaWare stack to enable compute capabilities with any combination of GPUs, TPUs, and NPUs on a networking and computing mesh. VeeaHub products are hyperconverged, multi-access and multi-protocol devices that provide for control plane management of heterogeneous devices on any vMesh cluster. This leading-edge solution enables network slicing for seamless connectivity across diverse network environments with Network Function Virtualization (NFV) and advanced Software Defined Networking (SDN) with fixed-line and/or wireless WAN connection, including 5G. AI-driven cybersecurity and Zero Trust Network Access (ZTNA) provide for a highly simplified Secure Access Service Edge (SASE). Its integrated compute and storage support a virtualized software environment enabling cloud-native applications to run in Secured Docker™ containers. Veea Edge Platform provides for end-to-end cloud management of devices, applications and services. Veea Developer Portal and development tools provide for rapid development of edge applications. The combined capabilities with AI-driven intelligence enables unparalleled scalability, security, and operational efficiency for enterprises, IoT ecosystems, and next-gen AI applications.

    Veea was recognized by Gartner in 2021 and 2023 for the innovativeness and capabilities of its edge computing platform, and was named a top 10 Edge AI solution provider alongside IBM, Microsoft, AWS and others in a Market Reports research report published in October 2023. For more information, visit veea.com and follow us on LinkedIn.

    About Vapor IO
    Vapor IO stands at the forefront of the AI revolution, delivering ultra-fast and ultra-low-latency solutions on-premises and across distributed edge locations with AI and private 5G networks. The company’s Zero Gap™ AI platform uniquely delivers on-demand GPUs and AI services directly to the locations where they’re needed, through Network-Delivered AI services in 36 key U.S. markets, including cities like Dallas, Las Vegas, and Seattle. Zero Gap AI uses Vapor IO’s Kinetic Grid® infrastructure, Supermicro’s AI-optimized servers, and NVIDIA’s groundbreaking AI silicon, including NVIDIA Aerial 5G private networks, to offer on-demand AI services in top U.S. markets.

    Zero Gap AI is a uniquely cost-effective way for enterprises, municipalities, and cloud providers to implement or expand their AI capabilities without investing in capital-intensive servers, networking equipment and data center facilities. Multiple AI access points in each market can be configured as availability zones, allowing for nearly unlimited degrees of resilience and continuous operation without interruption. Uniquely packaged with spectrum, highly optimized NVIDIA Aerial private 5G network services extend Zero Gap AI services to wherever they’re needed in many markets. Vapor IO’s extensive partner ecosystem can deliver specialized AI solutions built around the Zero Gap platform; from Smart City to Smart Retail, its network of partners has the industry know-how to build best-in-class solutions. Discover the difference Vapor IO can make with Network-Delivered AI solutions that fit your specific needs. Visit www.zerogap.ai to learn more.

    Zero Gap, Vapor, Kinetic Edge, Kinetic Grid, and Kinetic Edge Exchange are registered trademarks or trademarks of Vapor IO, Inc.

    Forward-Looking Statements
    This press release contains forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended (“Securities Act”) as well as Section 21E of the Securities Exchange Act of 1934, as amended, and the Private Securities Litigation Reform Act of 1995, as amended, that are intended to be covered by the safe harbor created by those sections. Forward-looking statements, which are based on certain assumptions and describe the Company’s future plans, strategies and expectations, can generally be identified by the use of forward-looking terms such as “believe,” “expect,” “may,” “will,” “should,” “would,” “could,” “seek,” “intend,” “plan,” “goal,” “project,” “estimate,” “anticipate,” “strategy,” “future,” “likely” or other comparable terms, although not all forward-looking statements contain these identifying words. All statements other than statements of historical facts included in this press release regarding the Company’s strategies, prospects, financial condition, operations, costs, plans and objectives are forward-looking statements. Important factors could cause the Company’s actual results and financial condition to differ materially from those indicated in the forward-looking statements. Such forward-looking statements involve risks and uncertainties, including those regarding the Company’s business strategies, and the risks and uncertainties described in “Risk Factors,” “Management’s Discussion and Analysis of Financial Condition and Results of Operations,” “Cautionary Note on Forward-Looking Statements” and the additional risks described in Veea’s Form 10-Q for the fiscal quarter ended September 30, 2024 and any subsequent filings which Veea makes with the U.S. Securities and Exchange Commission. You should not rely upon forward-looking statements as predictions of future events.
The forward-looking statements made in the press release relate only to events or information as of the date on which the statements are made in the press release. We undertake no obligation to update or revise any forward-looking statements, whether as a result of new information, future events or otherwise, after the date on which the statements are made or to reflect the occurrence of unanticipated events except as required by law. You should read this press release with the understanding that our actual future results may be materially different from what we expect.

    The Equity Group

    Devin Sullivan
    Managing Director
    dsullivan@equityny.com

    Conor Rodriguez
    Associate
    crodriguez@equityny.com

    The MIL Network –

    February 18, 2025
  • MIL-OSI Global: Trump has purged the Kennedy Center’s board, which in turn made him its chair – why does that matter?

    Source: The Conversation – USA – By E. Andrew Taylor, Associate Professor and Director of Arts Management, American University

    Former Kennedy Center President Deborah Rutter walks by The Reach, a major expansion of the performing arts center completed during her tenure. AP Photo/Patrick Semansky

    President Donald Trump dismissed half the appointed trustees of the John F. Kennedy Center for the Performing Arts’ board on Feb. 12, 2025. The remaining board members, most of whom he had recently appointed, then voted to make Trump the center’s chair. The board also fired Deborah Rutter, who had served as the center’s president since 2014 and already planned to step down seven months later.

    The board replaced Rutter with Richard Grenell, who served in the first Trump administration.

    The Conversation U.S. asked E. Andrew Taylor, an arts management scholar, to explain how the Kennedy Center operates and sum up the significance of Trump’s unprecedented interference with its operations.

    Why is the government involved in the Kennedy Center?

    The Kennedy Center, a unique cultural enterprise located along the Potomac River in Washington, has a complex ownership and operating structure. The campus includes three large performance halls, two midsize theaters and many smaller venues and public spaces that host musical, theatrical and dance performances, lectures, exhibits and other special events. In form and function, it looks a lot like other major metropolitan performing arts centers, such as New York City’s Lincoln Center. But its structure is different.

    The Kennedy Center is part of the federal government. Officially, it’s a bureau under the Smithsonian Institution.

    It was originally conceived during the Eisenhower administration and later championed by President John F. Kennedy. It was named after JFK following his assassination.

    The center opened in 1971, with a world premiere of composer Leonard Bernstein’s “Mass.” President Richard M. Nixon did not attend after the FBI warned him of possible anti-war messages encoded in the Latin text that might be designed to embarrass him.

    The center’s current mission statement captures its purpose and goals:

    “As the nation’s cultural center, and a living memorial to President John F. Kennedy, we are a leader for the arts across America and around the world, reaching and connecting with artists, inspiring and educating communities. We welcome all to create, experience, learn about, and engage with the arts.”

    Why does the Kennedy Center have a nonprofit board?

    From the start, the Kennedy Center was planned as a public-private effort. Government funding covers the maintenance, upkeep, security and restoration of the building and grounds.

    Private funds, largely derived from ticket sales, individual donors, foundations and corporations, cover the performances, productions and other programs.

    Those private funds cover more than three-quarters of the Kennedy Center’s budget. Its 2023 annual report explained that its US$286 million in revenue included $152 million from ticket sales, services and fees, $85 million from donations and $45 million from the federal government, with the rest derived from income from its endowment and other sources.
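    The “more than three-quarters” claim above can be sanity-checked with a few lines of arithmetic. This is a minimal sketch using only the rounded figures (in millions of US dollars) quoted from the 2023 annual report:

```python
# Kennedy Center 2023 revenue split, per the figures cited above (millions of USD).
revenue_musd = {
    "tickets, services and fees": 152,
    "donations": 85,
    "federal government": 45,
}
total_musd = 286  # total revenue; the remainder is endowment income and other sources

# Private funds = ticket revenue plus donations.
private_musd = revenue_musd["tickets, services and fees"] + revenue_musd["donations"]
private_share = private_musd / total_musd

print(f"Private funds cover about {private_share:.0%} of revenue")  # comfortably above 75%
```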

    In accordance with this public-private mix of revenue, the center’s governance has always been a hybrid, with the structure of a nonprofit board but with political appointees.

    The Kennedy Center’s board is authorized by its legislation to solicit and accept private donations, enter into contracts, maintain its halls and grounds, and appoint and oversee professional leadership. For the most part, it has the same responsibilities as any nonprofit board.

    There’s a big exception, however.

    While most nonprofit boards recruit, elect and develop their own membership, the Kennedy Center board consists of government appointees. About two dozen trustees serve by virtue of their government office, such as the librarian of Congress, the secretary of state, the mayor of Washington and the speaker and the minority leader of the U.S. House of Representatives.

    Up to 36 more are appointed by the president, each serving staggered six-year terms so that they don’t all expire at the same time.

    Singer-songwriter Sara Bareilles performs Elton John’s ‘Goodbye Yellow Brick Road’ with the National Symphony Orchestra in February 2025 at the Kennedy Center’s sold-out Concert Hall.

    Is the board supposed to be nonpartisan?

    The staggered six-year terms reflect a goal of keeping the governing board largely nonpartisan: since presidents usually appoint board members aligned with their own party, overlapping terms ensure that the board includes appointees from more than one administration. Until now, that balance has been the norm. But it wasn’t mandated when Congress passed the legislation establishing the Kennedy Center.

    Having a politically balanced board has historically helped the Kennedy Center raise money and attract world-class artists. For example, the 2025 season, as of mid-February, has included or will include Alvin Ailey American Dance Theater, jazz pianist Kenny Barron, soprano Renée Fleming, author David Sedaris, comedian Sarah Silverman and touring productions of “Parade” and “Les Misérables.”

    Its in-house productions are often classic works, such as “La Bohème” and Beethoven’s symphonies. Many of the center’s theatrical productions have gone on to Broadway and national tours, including “42nd Street,” “Noises Off” and revivals of “The King and I,” “Annie” and “Spamalot.”

    I’m concerned that many longtime or potential future donors may not want to contribute to a cause that has suddenly become subject to partisan leadership.

    Many artists and creative partners have already begun to sever their ties to the Kennedy Center or cancel upcoming shows at its venues out of an aversion to the board’s dramatic political turn. Some performances and tours tied to the center have been called off for other reasons that haven’t yet been made public.

    Members of the public may balk at attending events at a politically charged venue, especially with so many other performing arts options in and around Washington, reducing ticket sales.

    What does the Kennedy Center chair do?

    Board chairs are in charge of the governing board, expending considerable energy, attention, effort, political muscle and often personal wealth to ensure that the organization can thrive.

    The Kennedy Center’s prior chairs have not been figureheads. Rather, they have been actively engaged in fundraising, strategic planning and public advocacy. The legislation that chartered the center requires that its chair and secretary “shall be well qualified by experience and training to perform the duties of their respective offices.”

    Trump has admitted that he’s never seen a show at the Kennedy Center. He has no prior relevant arts board leadership experience. And he is constrained from serving on a nonprofit board in the state of New York after admitting to the misuse of charitable funds by the now-dissolved Donald J. Trump Foundation.

    David Rubenstein, the board chair ousted by this upheaval, has given the Kennedy Center at least US$111 million, making him the center’s biggest donor ever. The philanthropist spearheaded fundraising for its first major expansion, securing significant support from private corporations and foundations.

    Former Kennedy Center Chair David Rubenstein speaks at an event at the performing arts venue in 2022.
    AP Photo/Kevin Wolf

    Has anything like this happened before?

    No U.S. president has served as a member of the Kennedy Center board before, let alone its chair.

    Presidents do often appoint their friends and allies to government boards and commissions, and often remove appointees of previous administrations. President Joe Biden, for example, removed Sean Spicer – a former Trump press secretary and White House communications director – from the Naval Academy advisory board.

    But that board is leading a strictly governmental body, not a public-private hybrid so dependent on private funding. And the speed and scale of this purge are unprecedented.

    What are the potential consequences?

    All big, multi-venue metropolitan performing arts centers are extraordinarily complex and difficult to manage.

    The John F. Kennedy Center for the Performing Arts is particularly so. It hosts approximately 2,200 performances that draw more than 2 million visitors each year, with an in-house symphony and opera company. It produces the Kennedy Center Honors, which celebrate exceptional American artists with an annual gala, performance and television broadcast, and the Mark Twain Prize, which honors one accomplished American comedic actor, author or performer each year.

    The Kennedy Center hosts an annual event honoring a wide range of performers and other leaders in the arts.

    It’s also a national hub for arts education that serves 2.1 million students and teachers across all 50 states, doubling as an open campus: It offers daily free performances of everything from classical chamber music and ballet to jazz and rock bands.

    Even under the best possible conditions, this is a lot to handle.

    Successful arts nonprofits benefit from a governing board whose members have expertise in the arts, business and philanthropy, are loyal to the mission above themselves, and rigorously follow the law. Beyond those basics, ideal conditions also include having enthusiastic audiences, passionate donors, eager and exceptional artistic collaborators, and creative and administrative teams that are supported and empowered to do their difficult work.

    With Trump’s takeover of the Kennedy Center board, this national cultural center has now, essentially, turned into a branch of the White House. In my view, that’s a disturbing turn of events in a nation that celebrates free and creative expression. It’s also disruptive to a complex, mission-driven enterprise that demands care, loyalty and obedience from its governing board.

    E. Andrew Taylor directs American University’s Arts Management Program. Some of its alumni and students have worked as staff and fellows for The Kennedy Center.

    – ref. Trump has purged the Kennedy Center’s board, which in turn made him its chair – why does that matter? – https://theconversation.com/trump-has-purged-the-kennedy-centers-board-which-in-turn-made-him-its-chair-why-does-that-matter-249934

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: Why is water different colors in different places?

    Source: The Conversation – USA – By Courtney Di Vittorio, Assistant Professor of Engineering, Wake Forest University

    Crater Lake in Oregon looks brilliant blue because its water comes from melting snow and is extremely pure. CST Tami Beduhn, NOAA Ship Fairweather/Flickr, CC BY

    Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to curiouskidsus@theconversation.com.


    Why is water different colors in different places? – Gina T., age 12, Portland, Maine


    What do you picture when you think of water? An icy, refreshing drink? A crystal-blue ocean stretching to the horizon? A lake reflecting majestic mountains? Or a small pond that looks dark and murky?

    You would probably be more excited to swim in some of these waters than in others. And the ones that seem cleanest would probably be the most appealing. Whether or not you realize it, you are applying concepts in physics, biology and chemistry to decide whether you should leap in.

    The color of water offers information about what’s in it. As an engineer who studies water resources, I think about how I can use the color of water to help people understand how polluted lakes and beaches are, and whether they are safe for swimming and fishing.

    Light and the color of water

    Drinking water normally looks clear, but ponds, rivers and oceans are filled with floating particles. They may be tiny fragments of dirt, rock, plant material or other substances.

    These particles are often carried into the water during storms. Any rainfall that hits the ground and doesn’t soak into the soil becomes runoff, flowing downhill and picking up loose materials along the way until it reaches an open body of water.

    Particles in water interact with radiation from the Sun shining on the water’s surface. The particles can either absorb this radiation or reflect it in a different direction – a process known as scattering. What we see with our eyes is the fraction of radiation that is scattered back out of the water’s surface. It strongly affects how water looks to us, including its color.

    Visible light forms just a small part of the electromagnetic spectrum, which includes all types of electromagnetic radiation. Within the visible range, different wavelengths of light produce different colors.
    Ali Damouh/Science Photo Library, via Getty Images

    Depending on their properties, the particles in a water sample absorb and scatter radiation at different wavelengths. The wavelength of the light scattered back to us determines the color we see.

    Waters that contain lots of sediment – such as the Missouri River, nicknamed the “Big Muddy” – backscatter light across the yellow to red range. This makes the water appear orange and muddy.

    Cleaner, more pure water backscatters light in the blue range, which makes it look blue. One famous example is Crater Lake in Oregon, which lies in a volcanic crater and is fed by rain and snow, without any streams to carry sediment into it.

    Deep waters like Crater Lake look dark blue, but shallow waters that are very clear, such as those around many Caribbean islands, can appear light blue or turquoise. This happens because light reflects off the white, sandy bottom.

    When water contains a lot of plant material, chlorophyll – a pigment plants make in their leaves – will absorb blue light and backscatter green light. This often happens in areas that contain a lot of runoff from highly developed areas, such as Lake Okeechobee in Florida. The runoff contains fertilizer from farms and lawns, which is made of nutrients that cause plant growth in the water.

    Finally, some water contains a lot of material called color-dissolved organic matter – often from decomposing organisms and plants, and also human or animal waste. This can happen in forested areas with lots of animal life, or in heavily populated areas that release wastewater into streams and rivers. This material mostly absorbs radiation and backscatters very little light across the spectrum, so it makes the water look very dark.

    Bad blooms

    Scientists expect water in nature to contain sediments, chlorophyll and organic matter. These substances help to sustain all living organisms in the water, from tiny microbes to fish that we eat. But too much of a good thing can become a problem.

    For example, when water contains a lot of nutrients and heats up on bright sunny days, plant growth in the water can get out of control. Sometimes it causes harmful algal blooms – plumes of toxic algae that can make people very sick if they swim in the water or eat fish that came from it.

    When water bodies become so polluted that they threaten fish and plants, or humans who drink the water, state and federal laws require governments to clean them up. The color of water can help guide these efforts.

    Engineering professor Courtney Di Vittorio and her students collect water samples from High Rock Lake in North Carolina to assess its water quality.

    My students and I collect water samples at High Rock Lake, a popular spot for swimming, boating and fishing in central North Carolina. Because of high chlorophyll levels, algal blooms are occurring there more often. Residents and visitors are worried that these blooms will become harmful.

    Using satellite photos of the lake and our sampling data, we can produce water quality maps. State officials use the maps to track chlorophyll levels and see how they change in space and time. This information can help them warn the public when there are algal blooms and develop new rules to make the water cleaner.


    Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

    And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.


    Courtney Di Vittorio receives funding from the North Carolina Attorney General’s Office Environmental Enhancement Grant Program (award WFU021PRE1) to collect data at High Rock Lake, NC. She is affiliated with the Yadkin Riverkeepers, an environmental advocacy not-for-profit group, and the North Carolina Lake Management Society.

    – ref. Why is water different colors in different places? – https://theconversation.com/why-is-water-different-colors-in-different-places-243895

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: Why do skiers sunburn so easily on the slopes? A snow scientist explains

    Source: The Conversation – USA – By Steven R. Fassnacht, Professor of Snow Hydrology, Colorado State University

    Skiers can sunburn easily for reasons that have nothing to do with the mountain’s elevation. Matt Bird/Stone via Getty Images

    It’s extremely easy to get sunburned while you’re skiing and snowboarding in the mountains, but have you ever wondered why?

    While it’s true that you’re slightly closer to the Sun when you’re high in the mountains, that isn’t the reason.

    If you go up 1 mile (1.6 km), about the elevation gain from Denver to the peaks of resorts such as Vail or Copper Mountain, you’re only about a millionth of a percent closer to the Sun – that’s nothing. Since the Earth’s orbit is an ellipse and not a circle, the planet is about 1.7% closer to the Sun in early January compared with its annual average. This means skiers get about 3.3% more Sun in January than average for the year – so, not much more.
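    Both distance effects follow from the inverse-square law: the Sun’s intensity scales as one over the distance squared. A quick back-of-the-envelope check (using a rounded mean Earth–Sun distance; the inverse-square law gives roughly 3.5%, in line with the few-percent figure above):

```python
# Two distance effects on solar intensity, via the inverse-square law.

MEAN_DISTANCE_KM = 149_600_000  # mean Earth-Sun distance (about 1 AU), rounded

# Effect 1: climbing 1 mile (~1.6 km) closer to the Sun.
elevation_gain_km = 1.6
closer_fraction = elevation_gain_km / MEAN_DISTANCE_KM  # ~1e-8, i.e. ~a millionth of a percent
print(f"Closer by a fraction of {closer_fraction:.2e} -- negligible")

# Effect 2: Earth is ~1.7% closer to the Sun at perihelion (early January).
perihelion_distance_factor = 1 - 0.017
extra_intensity = 1 / perihelion_distance_factor**2 - 1  # intensity scales as 1/d^2
print(f"Extra January sunlight: {extra_intensity:.1%}")
```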

    Being 1 mile higher up does mean the atmosphere is thinner, so there are fewer particles to block the ultraviolet radiation that causes sunburns.

    But the big reason your skin is more likely to burn has to do with all that fresh powder that skiers and snowboarders crave, especially on perfect, blue-sky days. I’m a snow scientist at Colorado State University and an avid skier. There are many ways that snow conditions affect how much your skin will burn.

    Fresh snow is very reflective

    When you’re out in the snow, a lot of the solar radiation your skin receives is reflected from the snow itself. The amount of radiation reflected is known as albedo.

    Fresh powder snow can have an albedo of almost 95%, meaning it reflects almost all of the Sun’s radiation that hits it – much more than older snow, which becomes less shiny over time. Fresh snow has a lot of surfaces to reflect the Sun’s rays; as snow ages, its crystals become rounder and there are fewer surfaces to reflect light.

    Fresh snow has lots of planes to reflect the Sun’s rays, more so than older snow.
    Steven Fassnacht/Colorado State University, CC BY
    Older snow isn’t as reflective as it melts and the grains become rounder.
    Steven Fassnacht/Colorado State University, CC BY

    Having lots of fresh snow increases albedo because the Sun penetrates into the powder, reflecting off the small, newly fallen crystals. Think about starting a car after 6 inches of fresh snow fell. Some light still makes its way through the snow-covered windshield.

    Having only an inch of powder on crust is not as reflective as knee-deep fresh powder: shallow snow reflects less light.

    What is albedo?

    A lot of people want to ski on what are known as bluebird days, when there is deep, fresh powder under a clear blue sky following a big snow dump. However, this provides the perfect conditions to burn from two directions: lots of Sun coming down from above and high albedo reflecting it back to your face from below. Clouds block sunlight, with only about one-third of the Sun’s radiation making it through a fully overcast sky.
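    The two-direction effect can be sketched with a toy calculation. This is an illustrative estimate, not a radiative-transfer model: the 95% fresh-powder albedo and the one-third cloud transmission come from the figures above, while the 50% old-snow albedo is an assumed round number for comparison:

```python
def relative_uv_exposure(sky_transmission, snow_albedo):
    """Direct radiation from above plus the fraction the snow reflects back up.

    sky_transmission: fraction of sunlight that reaches the ground (1.0 = clear sky).
    snow_albedo: fraction of that sunlight the snow bounces back toward the skier.
    """
    direct = sky_transmission
    reflected = direct * snow_albedo
    return direct + reflected

# Bluebird day on fresh powder: clear sky, ~95% albedo.
bluebird_powder = relative_uv_exposure(sky_transmission=1.0, snow_albedo=0.95)

# Overcast day on old snow: ~1/3 of sunlight gets through; 50% albedo is an assumption.
overcast_old_snow = relative_uv_exposure(sky_transmission=1 / 3, snow_albedo=0.5)

print(f"Bluebird day on fresh powder: {bluebird_powder:.2f}x direct sunlight")
print(f"Overcast day on old snow:    {overcast_old_snow:.2f}x direct sunlight")
```

    On this rough estimate, a skier on a bluebird powder day receives nearly double the direct sunlight, while an overcast day over old snow delivers about half of it.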

    Which side of the mountain also matters

    Where you are on the mountain also makes a difference.

    The slope and the direction that the slope faces, called aspect, also influence the intensity of the Sun on a surface. North-facing slopes in the Northern Hemisphere get less direct sunlight in the winter, when the Sun is farther south in the sky, so they stay cooler.

    Ironton Park, near Ouray, Colo., on a clear blue day in February 2025.
    Steven Fassnacht/Colorado State University, CC BY

    A lot of the runs at Northern Hemisphere ski resorts face north, so their snow melts more slowly. The snow also varies from the top of the mountain to the base: there is more snow up high, and it melts more slowly there, so the albedo is higher at the top of the mountain than at the base.

    How to reduce the risk of sunburn

    To avoid sunburns, skiers and snowboarders need to take all of those characteristics into account.

    Because solar radiation is reflecting back up, people out in the snow should put sunscreen on the bottom of their noses, around their ears and on their chins, as well as the usual places.

    Most sunscreen also needs to be reapplied every two hours, particularly if you’re likely to sweat it off, wipe it off, or wear it off while playing on the slopes. However, surveys show that few people remember to do this. Wearing clothing with UV protection to cover as much skin as possible can also help.

    These methods can help protect your skin from burning and the risks of cancer and premature aging that come with it. Snow lovers need to remember that they face higher sunburn risks on the slopes than they might be accustomed to.

    Steven R. Fassnacht does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    – ref. Why do skiers sunburn so easily on the slopes? A snow scientist explains – https://theconversation.com/why-do-skiers-sunburn-so-easily-on-the-slopes-a-snow-scientist-explains-249858

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: Who are Ismaili Muslims and how do their beliefs relate to the Aga Khan’s work?

    Source: The Conversation – USA – By Shariq Siddiqui, Assistant Professor of Philanthropic Studies, Indiana University

    Prince Karim Aga Khan at an event on Oct. 2, 2019, in London. Max Mumby/Indigo/Getty Images

    Prince Karim Aga Khan, who died on Feb. 4, 2025, served as the religious leader of Ismaili Muslims around the world since being appointed as the 49th hereditary imam in 1957. He came to be known around the world for his enormous work on global development issues and other philanthropic work.

    The Ismaili community considers the imam a direct descendant of the Prophet Muhammad. Ismaili Muslims are considered to be a branch of Shiite Islam. They constitute the second-largest community within the Shiite sect.

    An estimated 15 million Ismaili Muslims live in 35 countries across all parts of the world. In the U.S., Texas has the largest concentration of the community, with around 40,000 Ismailis.

    As a scholar of Muslim philanthropy, I have long been impressed by the philanthropic and civic engagement of the Ismailis.

    Ismaili religious beliefs

    Following the death of the Prophet in A.D. 632, differences emerged over who should have both political and spiritual control over the Muslim community. A majority chose Abu Bakr, one of the Prophet’s closest companions, while a minority put their faith in his son-in-law and cousin, Ali. Those Muslims who put their faith in Abu Bakr came to be called Sunni, and those who believed in Ali came to be known as Shiite.

    Like other Shiite sects, Ismailis believe that Ali should have been selected as the successor of the Prophet Muhammad. They also believe that he should have been followed by Ali’s two sons – the grandsons of Muhammad through his daughter Fatima.

    The key difference between Ismailis and other Shiites lies in their lineage of imams. Both agree that Ali was the first imam and agree on the next five imams, who are direct descendants of Ali and Fatima. But Ismailis believe that Imam Ismail ibn Jafar was the rightful seventh imam, while the majority of Shiites, known as Twelvers, believe that Imam Musa al-Kazim, Ismail’s younger brother, was the true successor.

    The Ismaili sect split into two branches in 1094. Aga Khan was the leader of the Nizari branch, which believes in a living imam or leader. The second branch – Musta’lian Tayyibi Ismailis – believes that its 21st imam went into “concealment”; in his physical absence, a vicegerent or “da’i mutlaq” acts as an authority on his behalf.

    Like all Muslims, Ismailis believe that God sent his revelation to the Prophet Muhammad through Archangel Gabriel. However, they differ on other interpretations of the faith. According to the Ismailis, for example, the Quran conveys allegorical messages from God, and it is not the literal word of God. They also believe Muhammad to be the living embodiment of the Quran. Ismailis are strongly encouraged to pray three times a day, but it is not required.

    Ismailis believe in metaphorical, rather than literal, fasting. Ismailis believe that the esoteric meaning of fasting involves a fasting of the soul, whereby they attempt to purify the soul simply by avoiding sinful acts and doing good deeds.

    In terms of “Zakat,” or charity – the third pillar of Islam, which Muslims are required to follow – Ismailis differ in two ways: they give it to the leader of their faith, the Aga Khan, and they give 12.5% of their income rather than the customary 2.5%.

    Pluralism and its embrace

    Ismaili history has a strong connection to pluralism – part of their philosophy of embracing difference. The Fatimid Empire that ruled over parts of North Africa and the Middle East from 909 to 1171 is said to have been a “golden age of Ismaili thought.”

    It was a pluralistic community, in which Shiite and Sunni Muslims, as well as Christian and Jewish communities, worked together for the success of the flourishing empire, under the rule of the Ismaili imams.

    In the modern period, Ismailis have sought to further pluralism within their own communities by arguing that pluralism goes beyond tolerance, requiring people to actively engage across differences and embrace difference as a strength. For example, Eboo Patel, an Ismaili American, has established the nonprofit Interfaith America as a way to further pluralism among faith communities.

    The Aga Khan’s philanthropic work

    Prince Karim Aga Khan established the Aga Khan Development Network and Aga Khan Foundation in 1967.

    Some 53 nurses and 98 midwives from Ghazanfar Institute of Health Sciences, supported by The Aga Khan University in Karachi, Pakistan, and the United States Agency for International Development, attend a graduation ceremony in Kabul, Afghanistan, on March 29, 2009.
    Massoud Hossaini/AFP via Getty Images

    The network supports health care, housing, education and rural economic development in underprivileged areas. The foundation is one of nine agencies of the network that focuses on philanthropy. The Aga Khan Development Network has hospitals serving the poor in several parts of the world. The Aga Khan University’s medical school in Karachi, Pakistan, is considered one of the leading medical schools globally.

    While previous imams or leaders also led charity and development projects, the Aga Khan was the first to create a formal, global philanthropic foundation.

    The Aga Khan Foundation operates in countries with Ismaili populations or historical connections to the Ismaili community, such as Afghanistan, Egypt, India, Kenya, Kyrgyzstan, Madagascar, Mozambique, Pakistan, Portugal, Syria, Tajikistan, Tanzania and Uganda. The foundation also has offices in Australia, Canada, the United Kingdom and the United States, focusing primarily on raising funds and advocating for the foundation.

    According to the foundation, in 2023 it served over 20 million people through 23,310 civil society partner organizations.

    The Ismaili community will now be led by the Aga Khan’s eldest son, Rahim Al-Hussaini, as the 50th imam. He has been actively involved with the Aga Khan Development Network and is expected to continue the important philanthropic and development work of his global community.

    Shariq Siddiqui does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    – ref. Who are Ismaili Muslims and how do their beliefs relate to the Aga Khan’s work? – https://theconversation.com/who-are-ismaili-muslims-and-how-do-their-beliefs-relate-to-the-aga-khans-work-249318

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: Cutting funding for science can have consequences for the economy, US technological competitiveness

    Source: The Conversation – USA – By Chris Impey, University Distinguished Professor of Astronomy, University of Arizona

    National Institutes of Health indirect costs, which are under the knife, go toward managing laboratories and facilities. Fei Yang/Moment via Getty Images

    America has already lost its global competitive edge in science by some measures, and funding cuts proposed in early 2025 may accelerate a precipitous decline.

    Proposed cuts to the federal agencies that fund scientific research could undercut America’s global competitiveness, with negative impacts on the economy and the ability to attract and train the next generation of researchers.

    I’m an astronomer, and I have been a senior administrator at the University of Arizona’s College of Science. Because of these roles, I’m invested in the future of scientific research in the United States. I’m worried funding cuts could mean a decline in the amount and quality of research published – and that some potential discoveries won’t get made.

    The endless frontier

    A substantial part of U.S. prosperity after World War II was due to the country’s investment in science and technology.

    Vannevar Bush founded the company that later became Raytheon and was the president of the Carnegie Institution. In 1945, he delivered a report to President Franklin D. Roosevelt called The Endless Frontier.

    In this report, Bush argued that scientific research was essential to the country’s economic well-being and security. His advocacy led to the founding of the National Science Foundation and science policy as we know it today. He argued that a centralized approach to science funding would efficiently distribute resources to scientists doing research at universities.

    The National Science Foundation awards funding to many research projects and early career scientists. Pictured are astronomers from the LIGO collaboration, which won a Nobel Prize.
    AP Photo/Andrew Harnik

    Since 1945, advances in science and technology have driven 85% of American economic growth. Science and innovation are the engines of prosperity, where research generates new technologies, innovations and solutions that improve the quality of life and drive economic development.

    This causal relationship, where scientific research leads to innovations and inventions that promote economic growth, is true around the world.

    The importance of basic research

    Investment in research and development has tripled since 1990, but that growth has come from business-sector funding of applied research, while federal investment in basic research has stagnated. The distinction matters because basic research, which is purely exploratory, has enormous downstream benefits.

    Quantum computing is a prime example. Quantum computing originated 40 years ago, based on the fundamental physics of quantum mechanics. It has matured only in the past few years to the point where quantum computers can solve some problems faster than classical computers.

    Basic research into quantum physics has allowed quantum computing to develop and advance.
    AP Photo/Ross D. Franklin

    Worldwide, basic research pays for itself and has more impact on economic growth than applied research. This is because basic research expands the shared knowledge base that innovators can draw on.

    For example, a biotech advocacy firm calculated that every dollar of funding to the National Institutes of Health generates US$2.46 in economic activity, which is why a recent cut of $9 billion to its funding is so disturbing.

    The American public also values science. In an era of declining trust in public institutions, more than 3 in 4 Americans say research investment is creating employment opportunities, and a similar percentage are confident that scientists act in the public’s best interests.

    Science superpower slipping

    By some metrics, American science is preeminent. Researchers working in America have won over 40% of the science Nobel Prizes – three times more than people from any other country. American research universities are magnets for scientific talent, and the United States spends more on research and development than any other country.

    But there is intense competition to be a science superpower, and several metrics suggest the United States is slipping. Research and development spending as a percentage of GDP has fallen from a high of 1.9% in 1964 to 0.7% in 2021. Worldwide, the United States ranked 12th for this metric in 2021, behind South Korea and European countries.

    In number of scientific researchers as a portion of the labor force, the United States ranks 10th.

    Metrics for research quality tell a similar story. In 2020, China overtook the United States in having the largest share of the top 1% most-cited papers.

    China also leads the world in the number of patents, and it has been outspending the U.S. on research in the past few decades. Switzerland and Sweden eclipse the United States in terms of science and technology innovation. This definition of innovation goes beyond research in labs and the number of scientific papers published to include improvements to outcomes in the form of new goods or new services.

    Among American educators and workers in technical fields, 3 in 4 think the United States has already lost the competition for global leadership.

    Threats to science funding

    Against this backdrop, threats made in the beginning of President Donald Trump’s second term to science funding are ominous.

    Trump’s first wave of executive orders caused chaos at science agencies as they struggled to interpret the directives. Much of the anxiety involved excising language and programs relating to diversity, equity and inclusion, or DEI.

    The National Science Foundation is particularly in the crosshairs. In late January 2025, it froze the routine review and approval of grants and new expenditures, impeding future research, and has been vetting grants to make sure they comply with orders from the U.S. president.

    The National Institutes of Health announced on Feb. 7, 2025, a decision to limit overhead rates to 15%, which sent many researchers reeling, though it has since been temporarily blocked by a judge. The National Institutes of Health is the world’s largest funder of biomedical research, and these indirect costs support the operation and maintenance of lab facilities. They are essential for doing research.

    The new administration has proposed deeper cuts. The National Science Foundation has been told to prepare for the loss of half of its staff and two-thirds of its funding. Other federal science agencies are facing similar threats of layoffs and funding cuts.

    The impact

    Congress already failed to deliver on its 2022 commitment to increase research funding, and federal funding for science agencies is at a 25-year low.

    As the president’s proposals reach Congress for approval or negotiation, they will test the traditionally bipartisan support science has held. If Congress cuts budgets further, I believe the impact on job creation, the training of young scientists and the health of the economy will be substantial.

    Deep cuts to agencies that account for a small fraction – just over 1% – of federal spending will not put a dent in the soaring budget deficit, but they could irreparably harm one of the nation’s most valuable enterprises.

    Chris Impey has received funding from NASA, the National Science Foundation, and the Howard Hughes Medical Institute.

    – ref. Cutting funding for science can have consequences for the economy, US technological competitiveness – https://theconversation.com/cutting-funding-for-science-can-have-consequences-for-the-economy-us-technological-competitiveness-249568

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: The biggest threat in the Ontario election isn’t Donald Trump, it’s voter disengagement

    Source: The Conversation – Canada – By Mark Winfield, Professor, Environmental and Urban Change, York University, Canada

    Ontario Premier Doug Ford has justified his early election call on the need to respond to United States President Donald Trump’s threat to impose 25 per cent tariffs on Canadian imports.

    While the threat of tariffs on all Canadian imports has been paused — although Trump has since slapped levies on all steel and aluminum imports into the U.S. — Ontario voters need to reflect more than ever on the province’s circumstances and the performance of its government as they prepare to head to the polls next week.

    The Ford government’s approach to the environment and climate change, as well as its policies on a range of other issues like housing, health care and education, is best understood in the context of its overall “market populist” approach to governance.

    Several defining features of this model have emerged over the past six and a half years under Ford’s rule.

    Unaffordable proposals

    First, issues that require long-term perspectives on environmental, social and economic costs — like climate change — have tended to be disregarded. To the extent that the government has provided any sort of long-term vision, it has been focused on grandiose infrastructure projects.

    That includes a proposal to bury Highway 401 in Toronto — an undertaking with a potential cost of anywhere from $60 billion to more than $200 billion. But even that expense would pale in comparison to a recent proposal for a 10,000-megawatt nuclear power plant near Wesleyville, between Toronto and Kingston.

    The costs for the project, based on recent experiences in the U.S., could easily top the $200 billion mark as well.

    The Ford government’s drive to “get it done” has also, at times, invoked a near-Trumpian disdain for democratic norms and limits on executive authority. This has been illustrated by, among other things, the first invocation of the notwithstanding clause of the Canadian Charter of Rights and Freedoms in Ontario history.




    Read more:
    Doug Ford uses the notwithstanding clause for political benefit


    Power has been increasingly concentrated in the premier’s office. Under the guise of eliminating red tape, provisions for public participation, transparency and accountability in decision-making processes have been systematically eliminated.

    Processes for the meaningful environmental and economic review of major projects have suffered the same fate.

    Another defining issue is the Ford government’s approach to managing the province’s finances, with even the consistently pro-business Fraser Institute raising concerns.

    The disregard for financial responsibility has perhaps been most powerfully demonstrated by the issuing of $200 rebates to Ontario residents. These are expected to cost the provincial treasury more than $3 billion.

    Fewer revenue streams

    The Ford government has also displayed a willingness to eliminate billions a year in stable, long-term revenue streams, like vehicle licensing fees and fuel taxes. At the same time, major long-term costs and liabilities have been locked in, especially in relation to questionable infrastructure projects.

    All of this has taken place amid ongoing crises, attributed to provincial underfunding in areas like schools and post-secondary institutions, affordable (especially rental) housing and health care.

    In the longer term, liabilities are accumulating from the government’s failure to deal with the impacts of a changing climate.

    A final feature of the government’s market populist governance model has been an approach to decision-making based on connections, access and political whim rather than evidence or analysis.

    This pattern was perhaps most evident during the $8.3 billion Greenbelt land removal scandal involving well-connected developers. But the same pattern extends to the energy, for-profit health and resource extraction sectors as well.

    The province’s major opposition parties ran unsuccessfully in the 2022 election on the basis of platforms emphasizing adherence to what had been thought to be core principles in Ontario politics — moderation, managerial competence, and basic democratic values.

    Opposition parties

    This time, all three have turned to more populist themes.

    Liberal Leader Bonnie Crombie promises even more tax cuts than Ford. The NDP proposes to remove tolls from the 407 highway at an unknown cost to the provincial treasury and other programs.

    Even the Green Party, which has previously drawn praise for the content and imagination of its platforms, has picked up on populist themes, with an emphasis on affordability and a Ford-topping, and likely even more ambitious, promise to build two million new homes.

    Vulnerabilities for the Ford government abound. Recent polling suggests that despite the apparently strong Conservative lead, Ford himself is deeply unpopular, particularly among women voters. Sixty per cent of Ontario residents think the province is on the “wrong track.”

    The early election call itself is widely seen as costly, unjustified and opportunistic. The distraction of the election may well have weakened the province’s immediate capacity to deal with the Trump administration.




    Read more:
    An unnecessary Ontario election won’t help Canada deal with Donald Trump


    Questions and investigations around the Greenbelt land removal scandal and the government’s relationship with the land-development industry continue to close in on the premier’s office amid an ongoing RCMP investigation.

    Crises around housing, education, health care and electricity continue to deepen.

    Ontario’s Bill 23 eliminated or weakened many housing development regulations, including site plan controls, which kept the natural environment safe from the negative effects of poorly controlled development.
    THE CANADIAN PRESS/Nathan Denette

    Still disengaged?

    In calling an early election, the Ford government has provided Ontario voters with an unexpected opportunity to reflect on its record, and the potential paths forward for the province.

    Hopefully Ontario voters will engage more deeply with these questions than they did in the 2022 election, which had the lowest voter turnout in the province’s history.

    Three years ago, the government emerged with an overwhelming majority in the legislature on the basis of the ballots of less than 18 per cent of the province’s eligible voters. The stakes are far too high in 2025 for a repeat of that level of disengagement.

    Mark Winfield receives funding from the Social Sciences and Humanities Research Council of Canada. This article summarizes the author’s contributions to three new volumes on Ontario politics: The Politics of Ontario, 2nd ed. (UTP, 2024); Ontario Since Confederation: A Reader (UTP, 2025); and Against the People (Fernwood, 2025).

    – ref. The biggest threat in the Ontario election isn’t Donald Trump, it’s voter disengagement – https://theconversation.com/the-biggest-threat-in-the-ontario-election-isnt-donald-trump-its-voter-disengagement-249528

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI United Kingdom: New bus route to take passengers to Ocean Retail Park and beyond

    Source: City of Portsmouth

    Portsmouth residents are set to enjoy exciting upgrades to local bus services, including a new bus route to Ocean Retail Park. These improvements, made possible through funding from the Portsmouth Bus Service Improvement Plan (BSIP), will make travel around the city more convenient, faster, and more frequent.

    A brand-new route 19 is being introduced to connect bus passengers between Anchorage Park and Leigh Park, stopping at the Airport Industrial Estate, Admiral Lord Nelson School and Ocean Retail Park. Buses will run every hour between Monday and Saturday.

    Additionally, the popular route 18 will be enhanced, extending to Clarence Pier and running every 20 minutes between Monday and Saturday, and every 30 minutes on Sunday, offering a more frequent service for passengers.

    These enhancements are part of the Portsmouth BSIP and are aimed at meeting the growing demand for faster and more frequent public transport.

    Portsmouth City Council has partnered with local bus operator Stagecoach to bring these much-needed changes to the city, which will take effect from 6 April 2025. The new route and improved services will support commuters, shoppers, students and visitors to QA Hospital. They will provide better connections to key destinations across Portsmouth and offer a convenient connection for those heading to the Isle of Wight via Hovertravel.

    Improving the bus service is a key part of the council’s overall plan to make travel in the city better for everyone.

    Cllr Peter Candlish, Cabinet Member for Transport, said:


    “We’re excited to further enhance Portsmouth’s bus network, making it easier and more efficient to get around the city. These changes, part of our broader plan to improve travel for all, are based on feedback from our residents and will improve transport for commuters and visitors alike. We’re committed to delivering services that meet the needs of our community.”

    Rob Vince, Business Development Manager for Stagecoach, said:

    “We’re proud to partner with Portsmouth City Council to enhance bus services across Portsmouth. Through joint investment, we’re improving reliability, expanding services, and strengthening key connections to QA Hospital, Ocean Retail Park and the Isle of Wight—making travel more convenient and accessible for our communities.”

    Key improvements to bus services:

    • Service 18: Southsea • Fratton • Hilsea • QA Hospital • Paulsgrove
      Service 18 will run every 20 minutes Monday to Saturday daytime, and every 30 minutes on Sundays. Buses will extend to Clarence Pier and will now call at St Jude’s Church for Southsea Shops, offering better access to Southsea and improved connections to Hovertravel for the Isle of Wight.
    • Service 19: Leigh Park • Farlington • Burrfields • Portsmouth City Centre
      The new service 19, replacing the 21 between Anchorage Park and Leigh Park, will run every hour Monday to Saturday. The service will link Leigh Park with Farlington, the Airport Industrial Estate, Admiral Lord Nelson School, Ocean Retail Park, and Portsmouth city centre, providing faster, more direct travel for those living in and around the Leigh Park area.

    For further details on the new services visit stagecoachbus.com/promos-and-offers/south/portsmouth-changes-2025

    MIL OSI United Kingdom –

    February 18, 2025
  • MIL-OSI Russia: Congratulations on the Day of Russian Student Teams

    Translation. Region: Russian Federation –

    Source: State University of Management – Official website of the State –

    This year marks the 66th anniversary of the Russian student brigade movement. Ten years ago, by decree of President of Russia Vladimir Putin, an official holiday was established for participants of student brigades.

    The spring of 1959 is generally considered the moment the detachments emerged, when a group of 339 students from Lomonosov Moscow State University went to work on a construction site in the North Kazakhstan region, where virgin lands were being developed at the time. That date is somewhat arbitrary, however, since university students had been involved in agricultural work, large construction projects and the laying of railways since 1920.

    In the summer of 1962, the commanders of student detachments from leading Moscow universities wrote a collective letter to the General Secretary of the CPSU Central Committee Nikita Khrushchev asking him to support their movement. He gave the go-ahead, and on November 15, 1963, the first All-Union Rally of the VSSO took place in the Kremlin Palace of Congresses, where a single Charter for all student detachments was adopted.

    Since then, the movement has acquired a truly grand scale. Student brigades participated in the development of virgin lands, the development of gas fields in Tyumen, the construction of the BAM, the Moscow metro, the VAZ and KAMAZ plants, the Sayano-Shushenskaya hydroelectric power station and other large facilities. Thanks to their activities, many settlements were founded, including the cities of Bratsk and Ust-Ilimsk. Over the years of the movement’s existence, tens of millions of students passed through it. The movement peaked in 1982, when the number of brigade members at any one time reached almost 550,000.

    During their student years, the current President of Russia Vladimir Putin, the Minister of Foreign Affairs Sergey Lavrov, the Chairperson of the Federation Council Valentina Matviyenko and many other famous people had the opportunity to work in construction teams.

    Of course, this movement did not bypass the State University of Management, which in the heyday of the student brigades was called the Moscow Engineering and Economic Institute. The modern campus of the university was built with the most active participation of its students. Among them were the current professor of the Department of Information Systems Vladimir Godin, professor of the Department of Project Management Alexey Lyalin, deputy chairman of the primary trade union organization of GUU employees Nikolay Nesterov, professor of the Department of Management Theory of the Institute of Public Administration and Law Alexander Raichenko and others. We talked with the latter about the history of student brigades at GUU.

    Alexander Vasilyevich, please tell us how the student work brigade movement began at our State University of Management and about your experience in them.

    — It all started for us much earlier than I started participating in it. I first came to the construction team in August 1968, after I was enrolled as a first-year student. That year, we were sent to prepare the construction site of the university complex in the garden near the metro station, which is now called Vykhino. In addition, we already had construction teams in the Moscow region and teams that were engaged in harvesting agricultural products on state farms in the Moscow and Astrakhan regions. Then, starting in 1969, we began very large-scale construction of our complex.

    Every year, 300 to 700 students worked here – this was our main construction site. Some worked not only in the summer months. In connection with this, their curriculum was redrawn, but they completed it in full. The next most important detachment was the agricultural harvesting detachment of approximately 600 people, who went to work in the Astrakhan region almost every year from 1969 to 1981.

    Where else in the country, besides Astrakhan, did our detachments work? After all, the movement is known for its all-Union construction projects.

    — Large construction teams worked in Siberia. Every year, two or three teams worked on the construction of the first line of the Baikal-Amur Mainline. We worked on the construction of the Khrebtovaya-Ust-Ilimsk branch, the settlement of Igirma. 120 of our students worked there for two years. And some time later, we worked for another two years in the settlement of Zvezdny, also on the BAM. We also had teams in the Gorno-Altai Region. In 1969, there were about eight teams there, from each faculty. And in the Uzhur District of the Krasnoyarsk Territory, in the settlement of Shchetinkino, they were building a large residential complex. There were also some rather exotic places to work. One of the teams worked on industrial and civil construction in the settlement of Mirny, in Yakutia, the diamond capital of Russia. This was an unexpected appointment for us, but our students showed themselves well there.

    What practical benefits did these works provide to students?

    — The experience that students gained in construction teams was very helpful. I know more than 30 current managers who gained their first experience in production activities in student teams. Today they hold respectable positions, from the head of the construction and installation department to the governor of the region.

    And who from the current faculty of the State University of Management used to work in construction teams?

    — I know more than 20 people working at the university today who had such experience. The thing is that this work was considered as industrial practice. Rector of MIEI Olimpiada Vasilyevna Kozlova defined this activity as the first immersive industrial practice. It was not industry-specific, but it provided real and useful experience. Almost 100% of students, with the exception of those who could not participate in the work due to physical condition, were involved in one or another detachment. And the most active did this throughout the entire period of study. That is, every year, starting in May, when our quartermasters left, and ending in October, when the final results were summed up and we settled accounts with our customers, they actively participated in this work.

    We have an archive photo of MIE students in Czechoslovakia. Did our guys go anywhere else abroad?

    — What you are talking about was an interesting practice, it was called “currency-free exchange”. Student teams from our university were sent to four countries: the German Democratic Republic, Czechoslovakia (Charles University was a major partner of ours), Bulgaria (we had strong and long-term ties with it, our teams went there almost every year), and there were also ties with the Polish People’s Republic, although to a lesser extent. The same number of students from the universities with which we cooperated came from these countries. They worked for us, as a rule, on the construction of buildings for our university. Our students abroad worked at various sites, on construction sites of the national economy and the like.

    Today, the RSO comprises 400,000 young people from 85 regions of Russia who cooperate with more than 1,000 employers, including Russian Railways, Rosatom, Gazprom, EkoNiva, Artek and other large organizations. In this way, students not only gain practical professional skills but also help solve important economic problems and form the country’s personnel reserve.

    “This is a unique school of life that shapes not only professional and personal qualities, but also the desire to live and develop in the native country. We are proud that the guys are becoming part of a big cause – strengthening the economy and social sphere of Russia. The contribution of the student brigades is an investment in the future of our country,” said the head of the Federal Agency for Youth Affairs (Rosmolodezh), associate professor of the Department of State and Municipal Administration of the State University of Management Grigory Gurov.

    Let us recall that at the end of last year, the State University of Management signed a cooperation agreement with the RSO and this spring will begin active joint work in the area of pedagogical and educational activities, as well as the work of service departments.

    We congratulate everyone involved in the movement on the holiday! We wish you success in work and study, as well as a lot of pleasant impressions from business trips and communication with new acquaintances.

    Subscribe to the TG channel “Our GUU”. Date of publication: 02/17/2025

    Please note: This information is raw content directly from the source of the information. It is exactly what the source states and does not reflect the position of MIL-OSI or its clients.

    MIL OSI Russia News –

    February 18, 2025
  • MIL-OSI Europe: ASIA/MYANMAR – Funeral ceremony in the birthplace: ten suspects arrested in connection with the murder of Father Donald Martin Ye Naing Win

    Source: Agenzia Fides – MIL OSI

    Monday, 17 February 2025

    Archdiocese of Mandalay

    Yangon (Agenzia Fides) – More than 5,000 people, despite the dangers and general violence, gathered in the village of Pyin Oo Lwin to pay their last respects to Catholic priest Donald Martin Ye Naing Win, who was brutally murdered on February 14 in his parish of Our Lady of Lourdes in the Archdiocese of Mandalay (see Fides, 15/2/2025). The mountain village of Pyin Oo Lwin is Father Donald’s birthplace, where his family lives. There, priests, religious and faithful gathered around the Archbishop of Mandalay, Marco Tin Win, in the Catholic Church of the Assumption of the Virgin Mary to celebrate the funeral Mass and offer consolation to Father Donald’s family. The moving participation of the people, according to Fides sources present at the celebration, set the scene for the Mass, during which the Archbishop read the message of the Apostolic Nunciature in Yangon and the condolences of the Bishops’ Conference of Myanmar, which express deep and sincere solidarity with the local population (see Fides, 17/2/2025).

    Archbishop Marco Tin Win, who presided over the Eucharist, urged the faithful to wake up, “because violence only brings death and destruction, it is always a defeat”, and made a heartfelt appeal “to all armed groups and actors involved in the conflict to lay down their weapons and take a path of peace and reconciliation”. He then entrusted Father Donald, his family and the entire community present to the loving hands of the Virgin Mary: “May Our Lady accompany him to paradise and protect all under her mantle, giving comfort and hope,” said the Archbishop.

    The local community is asking about the reasons for the senseless murder of a priest who devoted himself with ardour to others. According to local sources, Father Donald was particularly involved in organizing educational work for children and young people in the area around his parish of Our Lady of Lourdes, where he was the first parish priest and where about 40 Catholic families live. Faced with civil war, violence and displacement, schools are closed, there are no teachers, and only informal classes given voluntarily by priests, religious and catechists ensure a minimum level of continuity in the education of children and young people.

    The area is controlled by the People’s Defence Force (PDF), which is fighting against the military junta. The leadership of these forces has been asked to investigate the armed groups that attacked and murdered the priest. The militias, meanwhile, have arrested ten men from the village of Kan Gyi Taw, where Father Donald was murdered. The People’s Defence Forces, according to Fides sources, are themselves interested in identifying and punishing the culprits and have transferred those arrested to a court set up by the PDF in the areas currently described as “liberated areas”, that is, not under the control of the Burmese government. (PA) (Agenzia Fides, 17/2/2025)

    MIL OSI Europe News –

    February 18, 2025
  • MIL-OSI Africa: Why is there so much gold in west Africa?

    Source: The Conversation – Africa – By Raymond Kazapoe, Senior lecturer, University for Development Studies

    Militaries that have taken power in Africa’s Sahel region – notably Mali, Burkina Faso and Niger – have put pressure on western mining firms for a fairer distribution of revenue from the lucrative mining sector.

    Gold is one of the resources at the heart of these tensions. West Africa has been a renowned gold mining hub for centuries, dating back to the ancient Ghana empire, which earned its reputation as the “Land of Gold” because of its abundant reserves and thriving trade networks. The region remains a global leader in gold production. As of 2024, west Africa contributed approximately 10.8% of the world’s total gold output.

    But why is there so much gold in this region? The Conversation Africa asked geologist Raymond Kazapoe to explain.

    How is gold formed?

    The simple answer here is that we are not certain. However, scientists have some ideas.

    Gold, like all elements, formed through high-energy reactions in various cosmic environments some 13 billion years ago, as the universe began to take shape.

    However, gold deposits – or the concentration of gold in large volumes within rock formations – are believed to occur through various processes, explained by two theories.

    The first theory – described by geologist Richard J. Goldfarb – argues that large amounts of gold were deposited in certain areas when continents were expanding and changing shape, around 3 billion years ago. This happened when smaller landmasses, or islands, collided and stuck to larger continents, a process called accretionary tectonics. During these collisions, mineral-rich fluids moved through the Earth’s crust, depositing gold in certain areas.

    A quartz vein rock specimen with visible gold. Mangiwau/Getty Images

    A newer, complementary theory by planetary scientist Andrew Tomkins explains the formation of some much younger gold deposits during the Phanerozoic period (approximately 650 million years ago). It suggests that as the Earth’s oceans became richer in oxygen during this period, gold became trapped as microscopic particles within another mineral known as pyrite (often called fool’s gold). Later, geological processes – like continental growth (accretion) and heat or pressure changes (metamorphism) – released this gold, forming deposits that could be mined.

    Where in west Africa is gold found and what are its sources?

    Most gold production and reserves in west Africa are found within the west African craton. This is one of the world’s oldest geological formations, consisting of ancient continental crust that has remained largely unchanged for billions of years.

    West African Craton. Wikipedia

    The craton underlies much of west Africa, spanning parts of Mali, Ghana, Burkina Faso, Côte d’Ivoire, Guinea, Senegal and Mauritania. In fact, most west African countries that have significant gold deposits have close to 50% of their landmass on the craton. Notably, between 35% and 45% of Ghana, Mali and Côte d’Ivoire’s territory sits on it – which is why these areas receive so much attention from gold prospectors.

    Gold deposits formed within west Africa’s craton rocks during a major tectonic event, known as the Eburnean Orogeny, 2.2 billion to 2.08 billion years ago. This event brought together the temperature, pressure and tectonic conditions that promote gold mineralisation. Most of the gold resources in the west African craton are found within ancient geological formations created by volcanic and tectonic processes about 2.3 billion to 2.05 billion years ago. These are known as the Rhyacian Birimian granitoid-greenstone belts.

    These gold-bearing belts in Ghana and Mali are by far the most richly endowed in the region. Together, Ghana and Mali currently account for over 57% of the combined past production and resources of the entire west African sub-region.

    Gold bearing geological structures in Ghana. Gerhard Michael Free/Shutterstock

    Ghana is thought to be home to 1,000 metric tonnes of gold. The country produces 90 metric tonnes each year – or 7% of global production. Gold production in Mali reached around 67.7 tonnes in 2023, and the country has an estimated 800 tonnes of gold deposits.

    By comparison, the world’s two largest gold producers are China (which mined approximately 370 metric tonnes of gold in 2023) and Australia (which had an output of around 310 metric tonnes in 2023).

    What are some of the modern exploration tools used to find gold?

    Gold was traditionally found by panning in riverbeds, where miners swirled sediment in water to separate the heavy gold particles, or by digging shallow pits to extract gold-rich ores. Over time, methods have evolved to include geochemical exploration techniques, advanced geophysical surveys, and chemical extraction techniques, like cyanide leaching.

    Geological mapping techniques are always evolving, and at the moment, there is a lot of interest in combining remote sensing data with cutting-edge data analytics methods, like machine learning. By combining these two methods, geologists can get around some of the problems caused by traditional methods, like the reliance on subjective judgement to create reliable maps and the need to spend money prospecting in areas with low chances of success.

    In recent years, deep learning techniques have made significant progress. These advanced artificial intelligence methods examine various geological datasets to reduce uncertainty and increase the chances of finding gold mineralisation. Applied to remote sensing data, they have proved highly beneficial in identifying specific features and discovering new mineral deposits.

    Another method, which I’ve researched and which could serve as a complementary gold exploration tool, is the use of stable isotopes. Stable isotopes are forms of elements – like carbon, hydrogen and oxygen – that do not decay over time. Some help to carry gold, in fluids, through rocks to form the deposits. As the gold-bearing fluids interact with the rocks, they transfer the stable isotopes to the rocks, thereby imbuing them with their unique signature. The thinking here is to identify the signature and then use it as a proxy for finding gold, since gold itself is hard to identify directly.

    Advancements in analytical techniques have reduced the cost, sample volume and time involved. This makes stable-isotope analysis a viable alternative to geochemical approaches – currently the most widely used and relatively efficient method.

    – Why is there so much gold in west Africa?
    – https://theconversation.com/why-is-there-so-much-gold-in-west-africa-248599

    MIL OSI Africa –

    February 18, 2025
  • MIL-OSI United Kingdom: New funding to help create the next generation of aviators and boost the economy

    Source: United Kingdom – Executive Government & Departments

    Latest round of Reach for the Sky programme awarded £810,000 to 16 organisations across the UK.

    • £810,000 of new government funding to help young people start a career in aviation by breaking down barriers to opportunity
    • with the air transport and aerospace sector contributing £20 billion to the UK economy, investment in the next generation of professionals will secure long-term economic growth and deliver on the government’s Plan for Change
    • Reach for the Sky scheme has now provided £2.3 million to 37 organisations, reaching 100,000 people across the country, from Cornwall to Carlisle

    The Aviation Minister has today (17 February 2025) launched the latest round of funding to encourage more young people into a career in aviation, helping to secure long term economic growth and ensuring the sector has the workforce needed for the future.

    Now in its third round, the government’s Reach for the Sky programme will see £810,000 awarded to 16 organisations across the UK, from Cornwall to Newcastle.

    The successful scheme, which totals £2.3 million, has now delivered funding to 37 outreach organisations and reached 100,000 people across the country.

    Supporting young people to pursue careers as pilots, navigators and controllers also aligns with the government’s ambition to go further and faster to kickstart growth. As part of the drive to build up aviation capacity at Heathrow and across the sector – from increased travel options to more UK homegrown aviation jobs – expansion in the sector plays a crucial part in unlocking economic prosperity.

    Reach for the Sky aims to break down barriers to opportunity and form the next generation of aviators, particularly by supporting young people from disadvantaged backgrounds who may not have considered a career in the sector before.

    Funding will help organisations deliver events, interactive workshops, taster days, mentorship schemes and educational initiatives with schools, universities and career professionals.

    Aviation Minister, Mike Kane, said:

    As part of our Plan for Change, we are breaking down barriers to opportunity so that every young person has the chance to pursue their dreams.  

    Programmes like Reach for the Sky turn ambition into reality, helping to inspire young people and introducing them to the benefits of a career in the skies.  

    I look forward to seeing the achievements of the next generation of aviators.

    With Office for National Statistics (ONS) data showing that young people from disadvantaged households are more likely to feel they do not have as much of a chance in life, programmes like Reach for the Sky help break down barriers to opportunity and expand horizons for underserved, hard-to-reach groups.

    This year’s recipients of the DfT-funded scheme include SaxonAir, The King’s Trust and Education and Employers, amongst others.

    SaxonAir, who have been successful in previous rounds, offer a range of scholarships, volunteering programmes and events for people of all backgrounds.

    One of their main initiatives is the INSPIRE programme, delivered in partnership with Business In The Community (BITC) at West Earlham Infant School. It aims to make the aviation industry inclusive for individuals of all ages, abilities, and backgrounds.

    The initiative is already making a tangible difference, with teachers at West Earlham Infant School in Norwich reporting a surge in enthusiasm for aviation among pupils following a recent visit.

    Hannah Colledge, HR and Wellbeing Coordinator at SaxonAir, said: 

    Our INSPIRE Outreach Programme is designed to spark a passion for aviation from as young as 5 years old, offering tailored activities that align with different age groups and connect appropriately to the curriculum.

    With support from the Reach for the Sky funding, we can extend our reach, ensuring that young people from all backgrounds have the chance to experience aviation firsthand.

    By breaking down barriers and bringing aviation opportunities to underrepresented communities, we are reinforcing our commitment to a more diverse and inclusive aviation sector.

    Graham, the father of a student at Aylsham High School, Norwich, said:

    [My son] really enjoyed the INSPIRE event and loved the opportunity to see what goes on behind the scenes in the aviation industry. His ambition is to be a pilot, but this event opened his eyes into other possibilities of work with and around aircraft. Thank you for providing him with this rare opportunity.

    Education and Employers Charity helps young people discover their future by bringing inspiration from the world of work into school. Reach for the Sky funding helps them connect aviation professionals with young people to deliver careers events and provide training across the UK.

    Speaking about one of these events, Josh, a pupil at Ealing Fields High School in London, said:

    I’ve wanted to be a pilot for a long time and the opportunity to listen to a pilot tell his story and career path was really impactful. At the end I was lucky enough to speak to him 1:1 and this really helped me with my questions. Since meeting with him I’ve made the most of opportunities and even visited a flight simulator. The talk was so impactful.

    The Civil Aviation Authority (CAA) is responsible for delivering the Reach for the Sky programme on behalf of DfT.

    Sophie Jones, Head of Organisational Capability and STEM Sponsor at the CAA, said:

    The aerospace sector provides many jobs and opportunities for development, and with the innovation and growth currently taking place, it is all the more vital for young people to join the industry.

    The Reach for the Sky Challenge fund provides support for outreach programmes that inspire the next generation, from all backgrounds, to pursue careers in aviation and aerospace, ensuring that the UK continues to be at the forefront of innovation and development.

    As the UK’s aviation regulator, we are proud to inspire the next generation’s journey into this fantastic industry through our STEM programme, funded by the Department for Transport.

    Aviation, Europe and technology media enquiries

    Media enquiries 0300 7777 878

    Switchboard 0300 330 3000


    Published 17 February 2025

    MIL OSI United Kingdom –

    February 18, 2025
  • MIL-OSI Global: Amish voters for Trump? The Amish and the religion factor in Republican electoral politics

    Source: The Conversation – France – By Daniele Curci, PhD Candidate in International and American History, University of Florence

    On November 5, 2024, as millions of Americans headed to the polls, billionaire Elon Musk posted a video on his social media platform X depicting a caravan of Amish individuals travelling via horse and buggy to vote for Donald Trump. The following day, in response to a post expressing gratitude to the Amish for their contribution to Trump’s victory, Musk wrote: “The Amish may very well save America! Thank goodness for them. And let’s keep the government out of their lives.” Musk’s tweets underscore the growing prominence of religion in US politics and the Republican party’s efforts to integrate the Amish into its electorate.

    The Amish and their vote in US history

    The Amish are a Protestant religious community rooted in early European Anabaptist movements. They accept technological advancements selectively, adhering to a distinct way of life marked by simple living, plain dress and a focus on community, distinguishing between what strengthens their social bonds and what might compromise their spiritual path. The Amish are a tiny minority in the US: in 2022, there were approximately 373,620 individuals in a population of around 330 million – slightly more than one in 1,000 Americans. They are predominantly concentrated in the election swing states of Pennsylvania and Ohio, which partly explains Republicans’ interest in courting their support.

    Traditionally, the Amish mainly abstain from voting unless they feel compelled to protect their religious freedoms, preserve their way of life or address critical moral issues. Historically, such instances of electoral participation have occurred only three times.

    The first instance dates back to the 1896 presidential election, when the Republican nominee, William McKinley, campaigned on a platform centred on industrial corporate interests. These interests diverged significantly from those of the Amish, who aligned instead with Democrat William Jennings Bryan’s policies advocating for small farmers and the defense of rural America.

    Amish political engagement resurfaced during the 1960 presidential election, which featured Republican Richard Nixon vs Democrat John F. Kennedy. The Amish viewed Kennedy as an ally of the Catholic church, an institution they regarded as intolerant. Consequently, they supported Nixon, a Quaker, whom they saw as a defender of a Protestant America.

    The most recent instances of notable Amish participation occurred amid the presidential election campaigns of Republican George W. Bush in 2000 and 2004. This phenomenon, dubbed “Bush Fever,” saw unprecedented Amish voter turnout. In 2000, 1,342 out of 2,134 registered Amish voters in Lancaster County, Pennsylvania – which has one of the largest Amish communities in the US – cast ballots, achieving a turnout rate of 63%. By 2004, Amish voter registration had increased by 169%, with 21% of eligible adults being registered. This mobilization was spearheaded by Chet Beiler, the son of Amish parents who left the community when he was three. Leveraging his heritage and fluency in Pennsylvania German, a traditional language spoken in many Amish communities, Beiler developed a voter registration strategy targeting the Amish to support Bush’s re-election campaign.

    The religious factor in US politics

    To understand the Republican party’s interest in the Amish, one must examine the increasing centrality of religion in US politics. This phenomenon persists despite a growing number of Americans identifying as non-religious or less religious.

    In the US political context, religion extends beyond faith to encompass cultural identity and social cohesion. Scholars often describe this phenomenon as “Christianism,” a form of nationalism that is bound together by a belonging to Christianity and that emerges, as a form of reaction, within the culture wars. Consequently, a political platform emphasizing Christian principles and rural values has the potential to galvanize segments of the electorate. This dynamic is exemplified by Musk’s tweets about the Amish. Within some parts of the Republican electorate, the Amish are perceived as “guardians of lost values,” embodying a vision of an untainted rural America defined by traditional family structures and an agrarian work ethic. This narrative has been further amplified by Amish PAC, a political action committee established in Virginia in 2016 to rally support for Trump through religiously framed identity politics that advocate for traditional values and oppose abortion rights.

    The influence of religion within the Republican party is further underscored by the ascendancy of the Christian right, a political movement that emerged in the late 1970s. Though not a monolithic entity, it is composed of individuals – primarily evangelical Christians – seeking to shape US politics based on a conservative interpretation of biblical principles and societal values.

    Legislation and the Amish

    Some Republicans have advocated for legislation favourable to the Amish, such as former US representative Bob Gibbs, who won election in the Amish-dominated congressional district of Holmes County, Ohio. In December 2021, Gibbs introduced legislation to allow people with specific religious beliefs such as the Amish, who view photography as a form of idolatry, to be exempt from a requirement of possessing identification documents featuring their photographs “to purchase a firearm from a federally licensed firearms dealer.” In the same month, Gibbs also proposed another bill to benefit the Amish, which would have allowed them to opt out of social security and Medicare wage deductions if they were employed by non-Amish-owned companies.

    Earlier in 2021, the conservative-majority Supreme Court resolved a longstanding dispute between the Amish of Lenawee County, Michigan and local authorities, ruling in favour of the Amish. The issue at the heart of the case concerned wastewater management. Following their religious principles, the Amish typically avoid using modern inventions such as septic systems, and the Amish in Lenawee County used a management method considered noncompliant by health officials. This case followed similar ones involving other Amish communities in Ohio, Minnesota and Pennsylvania. Legal disputes such as these could be leading the Amish to form a more positive view of the Republican party and Trump, both for their advocacy of “less government” and for positioning themselves as defenders of religious freedom.

    The Amish and the 2024 presidential election

    According to the online news source Anabaptist World, media reports suggested that the 2024 presidential election saw a surge in voter registrations among the Amish in Pennsylvania, allegedly contributing to Trump’s victory in the state. The alleged surge was reportedly driven by a reaction to federal legal actions against an Amish farmer accused of selling raw dairy products across state lines that had been linked to cases of Escherichia coli (E. coli) infection.

    However, official data from Lancaster County–where the principal Amish settlement in Pennsylvania is located–challenge claims of a massive Amish turnout. The increase in Trump’s vote share in the state, from 48.84% in 2020 to 50.37% in 2024, primarily occurred in urban and suburban areas. For example, by the time the Associated Press declared that Trump had won Pennsylvania, his vote share in Philadelphia had improved by three percentage points. Key suburban counties such as Bucks, Monroe and Northampton, which former president Joe Biden won in 2020, had swung in his favour. And the Republican had also performed better in the Philadelphia-area suburbs of Delaware and Chester counties. These regions, with few Amish residents, experienced substantial shifts, while districts with larger Amish populations saw only modest gains for Trump.

    While the Amish did not become a significant component of Trump’s electoral coalition, voters in some Amish communities may have grown more sympathetic to his candidacy. More importantly, members of the religious group serve as a potent symbol of mobilization and propaganda for the Republican party amid the intensifying polarization of US politics.

    Daniele Curci does not work for, consult for, own shares in or receive funding from any organisation that would benefit from this article, and has disclosed no affiliations other than his research institution.

    – ref. Amish voters for Trump? The Amish and the religion factor in Republican electoral politics – https://theconversation.com/amish-voters-for-trump-the-amish-and-the-religion-factor-in-republican-electoral-politics-247869

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: How Thailand’s TV lesbian romances captured a global audience

    Source: The Conversation – UK – By Eva Cheuk-Yin Li, Lecturer in Sociology (Media & Cultural Studies Team), Lancaster University

    While dramas about male same-sex romance (known as “boys’ love”, or BL) have been popular in Asia since 2010, “girls’ love” (GL) dramas are only now seeing a meteoric rise in popularity – and they are coming out of Thailand.

    On January 23 2025, Thailand became the first country in south-east Asia to legalise same-sex marriage. Although the country is often imagined as a “gay paradise”, Thai society remains largely conservative and homophobia is still commonplace. Against this social backdrop, the rise of LGBTQ+ storytelling is intriguing – perhaps revealing the emergence of more tolerant and progressive attitudes.

    In Thailand, these BL and GL dramas are known as series “Y”, an industry estimated to be worth 3 billion baht (approximately £72 million) in 2024. Thailand’s GL dramas are now reshaping sapphic storytelling and bringing it to the mainstream.

    Besides the central romance plotline, GL stories often explore pertinent issues such as family expectations and societal pressure, coming-out struggles, and age and class differences. These issues add depth to the narratives and chime with young queer audiences seeking more realistic, relatable experiences.


    A hub for BL series since the mid-2010s, Thailand only produced its first full-length GL series in 2022. Despite investor doubts, the producer of a then-small production house financed a pioneering series called Gap, telling the story of an office romance between a royal-descendant CEO and a junior member of staff.

    Airing on domestic TV and later uncut on YouTube with multilingual subtitles, Gap amassed over 850 million views by January 2025, proving a global appetite for queer women-oriented stories. By February 2025, more than 20 GL series had aired, with at least 30 more in production.

    Trailer of Gap (2022), Thailand’s first full-length GL series.

    Series like Blank, 23.5, The Secret of Us, Affair, and The Loyal Pin illustrate the genre’s growing popularity, with uncut versions available on platforms like YouTube and Netflix, complete with subtitles in various languages such as English, Korean, Vietnamese, Spanish, Portuguese and Turkish.

    Thailand’s GL dramas have adopted successful practices from their BL counterparts: adapting novels, scouting and training actors, incorporating product placement, hosting fan events and appearing on variety shows. One notable practice is the making of khu-jin (imagined couple), where celebrities perform same-sex intimate moments on stage or social media to serve fans’ fantasies.

    “Shipping” culture – the practice of imagining or supporting a romantic relationship between fictional or real individuals – is pivotal to GL’s success. The two Gap leads, Freen Sarocha and Becky Armstrong, have created the “FreenBecky” ship, and each has more than four million Instagram followers. Actresses of other “ships”, such as LingOrm, EngLot and FayeYoko, command similarly devoted followings. Their fan meetings across Asia regularly draw tens of thousands, blending fiction and reality to create an immersive fan ecosystem.

    Celebrating Girls Love

    As we discussed in our recent research, Thai GL series also emphasise joy and resilience, unlike the tragic endings often seen in western LGBTQ+ narratives. US-produced content has been criticised for the “bury your gays” trope, where LGBTQ+ characters are frequently killed off in tragic or unnecessary ways.

    Another objection is “dead lesbian syndrome”, where lesbian and bisexual characters are even more likely to be killed on screen. Notorious examples include Killing Eve and The 100.

    In contrast, Thai GL stories celebrate love and acceptance, despite the challenges experienced by protagonists. Series like Gap, The Secret of Us, and Mate feature grand wedding finales with the blessing of parents and friends, portraying queer love overcoming obstacles and thriving.

    GL series also speak directly to the queer women’s community. Many actresses, such as Engfa Waraha in Show Me Love and Petrichor, and Faye Malisorn in Blank, are openly queer or vocal queer allies.

    Although many GL series have male directors, love scenes are respectful, focusing on sensuality and desire rather than being graphic and exploitative. This contrasts with films such as Blue is the Warmest Colour, in which love scenes were criticised as being exploitative, and where actresses have reported problematic practices during filming.

    Opportunities and challenges

    From their inception, Thai GL dramas have aired locally but have quickly been made available on streaming platforms with multilingual subtitles for a global audience. Social media platforms amplify their reach, with production houses curating trends and fostering interactive fan experiences.

    Recognising the potential for cultural export, the Thai government has partnered with BL and GL production companies to promote Thai culture and products. It is unusual for governments to embrace queer culture as a vehicle for soft power, which highlights the growing cultural and economic significance of these series. Though this development has sparked concerns over the intentions behind such support, it signals a future where queer narratives hold global, cultural and political relevance.

    Despite its success, GL entertainment faces challenges. Many series are still adaptations of novels, limiting thematic diversity. While themes like schoolyard dramas and sweet romances such as Love Senior, Unlock Your Love, and Us prevail, some series are pushing boundaries with themes like disability (Pluto), supernatural power (Reverse 4 You), and crime (Petrichor).

    GL romances provide a vital space for queer women’s stories, connecting audiences across borders through global visibility and fan culture. Most remarkably, this shift isn’t coming from Hollywood.

    As the genre evolves, it holds the potential to continue redefining representation and amplifying underrepresented voices. It’s not just reshaping how queer women’s stories are told and viewed globally, it’s proving to be commercially viable and culturally transformative.

    In the face of rising global reactionary politics and growing hatred against the LGBTQ+ community following Trump’s re-election, Thai GL series offer not only a safe escape and fantasy, but also a sense of solidarity through their worldwide fandom.

    The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

    – ref. How Thailand’s TV lesbian romances captured a global audience – https://theconversation.com/how-thailands-tv-lesbian-romances-captured-a-global-audience-248261

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: Why did life evolve to be so colourful? Research is starting to give us some answers

    Source: The Conversation – UK – By Jonathan Goldenberg, Postdoctoral Researcher in Evolutionary Biology, Lund University

    Jonathan Goldenberg, CC BY-NC-ND

    Picture a primordial Earth: a world of muted browns, greys and greens. Fast forward to today, and Earth teems with a kaleidoscope of colours. From the stunning feathers of male peacocks to the vivid blooms of flowers, the story of how Earth became colourful is one of evolution. But how and why did this explosion of colour happen? Recent research is giving us clues into this part of Earth’s narrative.

    The journey towards a colourful world began with the evolution of vision, which initially developed to distinguish light from dark over 600 million years ago. This ability probably arose in early organisms, like single-celled bacteria, enabling them to detect changes in their environment, such as the direction of sunlight. Over time, more sophisticated visual systems evolved and allowed organisms to perceive a broader spectrum of light.

    For example, trichromatic vision – the ability to detect three distinct wavelengths such as red, green and blue – originated approximately 500-550 million years ago. This coincided with the “Cambrian explosion” (about 541 million years ago), which marked a rapid diversification of life, including the development of advanced sensory systems like vision.

    The first animals with trichromatic vision were arthropods (a group of invertebrates that includes insects, spiders and crustaceans). Trichromatic vision emerged 420-500 million years ago in vertebrates. This adaptation helped ancient animals to navigate their environments and detect predators or prey in ways that monochromatic vision could not.

    Fossil evidence from trilobites, extinct marine arthropods that roamed the seas over 500 million years ago, suggests they had compound eyes. This means eyes with multiple small lenses, each capturing a fraction of the visual field, which combine to form a mosaic image. These eyes could detect multiple wavelengths, providing an evolutionary advantage in dim marine environments by enhancing the animal’s visibility and motion detection.

    Boyd’s forest dragon blends in with its habitat.
    Jonathan Goldenberg, CC BY-NC-ND

    The stage was set: organisms could see a colourful world before they became colourful themselves.

    The first burst of conspicuous colour came from plants. Early plants began producing colourful fruits and flowers, such as red, yellow, orange, blue and purple, to attract animals to help plants with seed dispersal and pollination.

    Analytical models based on present-day plant variation suggest that colourful fruits, which appeared roughly 300-377 million years ago, co-evolved with seed-dispersing animals, such as early relatives of mammals. Flowers and their pollinators emerged later, around 140-250 million years ago. These innovations marked a turning point in Earth’s palette.

    The rise of flowering plants (angiosperms) in the Cretaceous period, over 100 million years ago, brought an explosion of colour, as flowers evolved brighter and more vibrant hues than seeds to attract pollinators like bees, butterflies and birds.

    Conspicuous colouration in animals emerged less than 140 million years ago. Before, animals were mostly muted browns and greys. This timeline suggests that colour evolution was not inevitable, shaped instead by ecological and evolutionary factors, which could have led to different outcomes under different circumstances.

    Vibrant colours often evolved as a kind of signalling to attract mates, deter predators, or establish dominance. Sexual selection probably played a strong role in driving these changes.

    Dinosaurs provide some of the most striking evidence of early animal colouration.
    Fossilised melanosomes (pigment-containing cell structures called organelles) in feathered dinosaurs like Anchiornis reveal a vivid red plumage.

    These feathers probably served display purposes, signalling fitness to mates or intimidating rivals. Similarly, the fossilised scales of a ten-million-year-old green and black snake suggest early use of colour for signalling or camouflage.

    This snake, a juvenile Bornean keeled green pit viper, comes in a variety of colours.
    Jonathan Goldenberg, CC BY-NC-ND

    The evolution of colour is not always straightforward. Take poison frogs, for instance. These small amphibians display striking hues of blue, yellow, or red, not to attract mates but to warn predators of their toxicity, a phenomenon known as aposematism.

    But some of their close relatives, equally toxic, blend into their environments. So why evolve bright warning signals when camouflage could also deter predators? The answer lies in the local predator community and the cost of producing colour. In regions where predators learn to associate vibrant colours with toxicity, conspicuous coloration is an effective survival strategy. In other contexts, blending in may work.

    Clownfish lure other fish to anemone with their bright colours.
    Jonathan Goldenberg, CC BY-NC-ND

    Unlike many mammals, which have dichromatic vision and see fewer colours, most primates including humans have trichromatic vision, enabling us to perceive a broader range of hues, including reds. This probably helped our ancestors locate fruit in forests and likely played a role in social signalling. We see flowers differently from pollinators like bees, which can detect ultraviolet patterns invisible to us, highlighting how colour is tailored to a species’ ecological needs.

    A world still changing

    Earth’s palette isn’t static. Climate change, habitat loss, and human influence are
    altering the selective pressures on colouration, potentially reshaping the visual landscape of the future. For example, some fish species exposed to polluted waters are losing their vibrant colours, as toxins disrupt pigment production or visual communication.

    As we look to the past, the story of Earth’s colours is one of gradual transformation punctuated by bursts of innovation. From the ancient seas where trilobites first saw the world in colour to the dazzling displays of modern birds and flowers, life on Earth has been painting its canvas for over half a billion years.

    What will the next chapter of this vibrant story hold?

    Jonathan Goldenberg receives funding from the European Union’s Horizon Europe research and innovation program under the Marie Skłodowska-Curie grant agreement No. 101126636.

    – ref. Why did life evolve to be so colourful? Research is starting to give us some answers – https://theconversation.com/why-did-life-evolve-to-be-so-colourful-research-is-starting-to-give-us-some-answers-247136

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: What does the US public think about sending troops to foreign wars? Here’s what the evidence shows

    Source: The Conversation – UK – By Dafydd Townley, Teaching Fellow in International Security, University of Portsmouth

    The US public’s commitment to sending its sons and daughters to war has declined in recent years. Polls suggest that US involvement in modern conflicts is more likely to be viewed as mistaken than in the early and middle parts of the 20th century. Today, around 47% of Americans consider the Iraq war a mistake, and 43% feel the same about the war in Afghanistan.

    Recent announcements by the US president, Donald Trump, about the possibility of using US forces as part of his Gaza strategy are unlikely to improve those figures.

    On February 4, Trump proposed that the US effectively take control of the Gaza Strip and rebuild the area into what he has called the riviera of the Middle East.

    When he was asked at a press conference whether he would be willing to use US troops to secure the region, Trump answered that “as far as Gaza is concerned, we’ll do what is necessary. If it’s necessary, we’ll do that. We’re going to take over that piece that we’re going to develop it”.

    Trump walked back that initial claim about the use of military personnel just days later, stating that US military force would be unnecessary. “The Gaza Strip would be turned over to the United States by Israel at the conclusion of fighting,” he wrote, adding that “No soldiers by the U.S. would be needed! Stability for the region would reign!” But others have suggested a US military presence would have to be involved.

    Putting US troops on the ground would fly in the face of current American public opinion. In a survey taken on February 12, only a quarter of those polled supported the prospect of US troops being sent to the region, and just over half (52%) of Republicans disapproved of the plan.

    Less than 25% of Americans supported the US taking ownership of the Gaza Strip, while 62% showed opposition to it. Less than half (46%) of Republican voters polled expressed support while only 10% of Democrats showed any kind of enthusiasm for the initiative, according to the poll.

    Of those polled, the majority said they opposed all of Trump’s plans to expand US-controlled territory, whether that was the Panama Canal, Greenland, Canada, or Gaza.

    The lack of support from the US public in deploying troops overseas has been constant since the withdrawal from Afghanistan in 2021 – and the American public appears to be questioning US military involvement in world affairs more generally.

    In a poll taken by the foreign policy thinktank Defense Priorities in February 2024, 56% of respondents were “very worried” or “somewhat worried” that the presence of US troops in Syria could escalate into a broader conflict in the region. Of those who opposed a US military presence in Syria, 66% felt that it was a waste of resources.

    And just last September, a Pew Research Center poll revealed that 75% of those polled were worried about the Israel-Hamas conflict expanding in the region and US troops becoming more directly involved.

    Recruitment ad for the US Marines.

    This lack of public support for US military involvement abroad, as well as the poor recent record of recruitment into the military, may be informing Trump’s negotiations in both Gaza, and over the Ukraine war.






    While the US public shows high levels of respect for those who serve in the military, around 80% of American teenagers are not interested in military service, while 55% of adults and 67% of parents are not likely to recommend it as a career to teenagers.

    The US has tried numerous recent initiatives, including offering substantial bonuses to entice recruits to join up, but without much success. The army, navy and air force all failed to reach their target recruitment numbers in 2023.

    This week Trump opened early discussions with Vladimir Putin, and latterly Kyiv, over proposals for a Ukraine peace deal. In a meeting with European defense ministers in Brussels on February 12, the new US defense secretary Pete Hegseth ruled out the participation of US troops in any peacekeeping mission in Ukraine, although in an interview with the Wall Street Journal on February 13 vice-president JD Vance did not rule out using the military.

    Hegseth also said that the US was planning to pull back from its role in European security, sparking high levels of concern from many European leaders.

    Some Republican senators have not been particularly supportive of Trump’s Ukraine proposals, especially those that have backed Ukraine over the last three years.

    In an interview, Senate armed services chair, Roger Wicker, said that “there are good guys and bad guys in this war, and the Russians are the bad guys. They invaded, contrary to almost every international law, and they should be defeated. And Ukraine is entitled to the promises that the world made to it.” Republican Senator Mike Rounds joined Wicker in demanding that: “Russia be recognised for the aggressor that they are.”

    There’s a similar level of concern on Trump’s Gaza plan – even from Trump’s close allies in the party. Rand Paul, the libertarian senator for Kentucky, suggested this idea flew in the face of Trump’s foreign policy proposals espoused during the campaign.

    “I thought we voted for America First. We have no business contemplating yet another occupation to doom our treasure and spill our soldiers’ blood,” he wrote on X.

    It is unlikely that the majority of Republican voters would be supportive of Trump’s Gaza initiative (or sending troops to Ukraine). This is partly because of the demands that it would make on the federal government – but also because of the necessity of using armed forces to implement it.

    Trump’s recent controversial executive orders have barely damaged his early job approval ratings. But the deployment of armed forces to Gaza or Ukraine runs counter to a long-term significant decline in public support for US overseas military intervention and that might be a step too far for many voters.

    Dafydd Townley does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    – ref. What does the US public think about sending troops to foreign wars? Here’s what the evidence shows – https://theconversation.com/what-does-the-us-public-think-about-sending-troops-to-foreign-wars-heres-what-the-evidence-shows-249419

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: Autistic women face barriers to safe and supportive maternity care – new research

    Source: The Conversation – UK – By Aimee Grant, Senior Lecturer in Public Health and Wellcome Trust Career Development Fellow, Swansea University

    New research looks at the experiences of autistic women during pregnancy and childbirth. Zhuravlev Andrey/Shutterstock

    Childbirth is often described as one of life’s most profound experiences, but for many, it can be fraught with anxiety, pain and trauma.

    Autism is a lifelong neurotype, which affects around 3% of people. It is linked to differences in communication and sensory processing.

    Women have historically been underdiagnosed with autism, diagnosed at an older age, or misdiagnosed. This may explain why very little research has been conducted on the experiences of autistic women during pregnancy and childbirth – an oversight we have aimed to address in our new research.

    There are issues affecting maternity services across the nations of the UK. Last year, almost half of maternity services in England were rated as “needing improvement” or “inadequate” by England’s health service regulator, the Care Quality Commission. They also noted that communication with women – especially those from marginalised groups – could lead to fear, anxiety and having a negative birth experience.

    Following reviews of baby deaths in Scotland, inspections of maternity services are underway, with units given no prior notice. Likewise, following the death of a baby, an independent review of maternity services in Northern Ireland recommended widespread changes and additional funding to make services safe. While a review of maternity services in Wales reported that services are generally good and safe, issues have been identified in some health boards.

    In a medical context, “informed consent” means that a person understands what will happen during a test or treatment, and that they are aware that they can say “no” to having it. We know that in English maternity units, there are sometimes issues with women not being given the information needed for them to give informed consent.

    What we found

    Our research aimed to understand barriers to good maternity care for autistic people. We asked 193 autistic people from across the UK who had been pregnant to tell us what happened during their care in an online survey. It’s important to note that half of our participants weren’t aware they were autistic when they gave birth.

    Most participants told us they felt they had to “mask”, or act as though they weren’t autistic, to try to get better maternity care. Despite this, more than half said they felt they weren’t listened to by maternity staff. Almost half also said they felt staff misunderstood them and that they were unsupported.

    Worryingly, more than a third didn’t understand explanations from healthcare professionals about their examinations and treatments. Nearly half said they weren’t given the choice to say no to having examinations, including vaginal examinations. This means that many of our participants weren’t able to give informed consent to the treatment they received.

    Another concerning issue was that some participants’ pain during childbirth was untreated. And ten people told us that they could tell they were on the verge of giving birth, but were not believed by maternity staff.

    Maternity services are not meeting the needs of autistic women.
    christinarosepix/Shutterstock

    When sharing their stories, most of our participants felt that staff didn’t understand autistic people, including how they communicate and experience pain. While autistic people feel pain at the same level as non-autistic people, they often show it differently, including having fewer outward signs of pain.

    Our participants also acknowledged there were issues in how maternity systems are designed, with staff appearing to have too much work to understand the needs of the individual pregnant person and change the care they give accordingly.

    Altogether we found that autistic people’s needs were not met during maternity care, with lack of consent, breached trust and safety issues common. Many of the issues we asked participants about are known to be linked to birth trauma. Our study provides initial support for a hypothesis that rates of birth trauma may be higher in autistic people.






    Also, autistic women are at much greater risk of sexual assault compared to non-autistic peers, with one study reporting nine in ten had been victims. Research shows that sexual abuse survivors can be re-traumatised during birth.

    Participants told us that they did not have their questions about pregnancy and birth answered by maternity staff, and that this caused anxiety. So, we have worked with the autistic organisations Autistic Parents UK and Autistic UK alongside autistic maternity professionals and parents to create 114 short videos to answer their questions. They are available in English and Welsh, and are already being used by some NHS trusts.

    UK maternity services urgently need to become more autism-friendly. Things that may help include seeing the same midwife every time and having longer appointments, so that all questions can be answered.

    It’s also important for maternity staff to receive training in how to best support autistic people, which has been developed by autistic people. This is already available in England but not in the other UK nations. That should be introduced as a matter of urgency.

    Aimee Grant receives funding from the Wellcome Trust, Medical Research Council and the Morgan Advanced Studies Institute. She is a non-executive director of Disability Wales.

    Kathryn Williams receives funding for her PhD from the Economic and Social Research Council. She is a Director of Autistic UK CIC.

    Catrin Griffiths does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    – ref. Autistic women face barriers to safe and supportive maternity care – new research – https://theconversation.com/autistic-women-face-barriers-to-safe-and-supportive-maternity-care-new-research-247017

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: Heat pumps have a cosiness problem

    Source: The Conversation – UK – By Aimee Ambrose, Professor of Energy Policy, Member of Fuel Poverty Evidence and Trustee of the Fuel Poverty Research Network, Sheffield Hallam University

    How we keep warm at home accounts for 17% of the UK’s greenhouse gas emissions. The UK cannot reach net zero emissions, and end its contribution to climate change, without ending its reliance on natural gas as the dominant source of heating.

    As elsewhere in Europe, heat pumps (which use electricity to draw heat out of the air or ground and circulate it indoors) are regarded as the best way to reduce carbon emissions. But are people ready to ditch their gas boilers?

    My colleagues and I spent three years researching what people need, want and expect from their heating systems by asking 300 people in eight settlements across the UK, Finland, Sweden and Romania about their experiences of trying to keep warm at home. These memories ranged from as early as 1945 to the present day.

    Among the four countries we studied, the uptake of heat pumps is most sluggish in the UK and Romania. In Sweden, heat pumps are an established technology, used to heat homes outside of dense urban areas that tend to be served by heat networks, where a boiler is shared by multiple dwellings and heat pumped to each home through pipes.

    Successive oil crises accelerated the roll-out of electric heating in Sweden during the 1970s. Our participants credited widespread trust in the Swedish government at the time for the successful adoption of heat pumps.

    Relatively low trust in the government makes it more difficult to increase heat pump uptake in the UK, a problem shared by Romania, where low trust in the government follows decades of communist rule during which energy could be cut off to maintain supply to industries.

    When coal was king and stoves were guilt-free

    We found that there were strong attachments to high-carbon fuels in many of the communities we studied – even where people were committed to a future with low-carbon energy.

    In former coalfields, such as Rotherham in south Yorkshire and Jiu Valley in south-west Romania, people spoke wistfully of the coal industry which provided jobs, housing and plentiful fuel for heating and cooking, except during industrial disputes. The coal fire was where most of our participants let their minds linger.

    The subsequent move to natural gas heating for most UK households, which started in the 1960s, failed to evoke the same enthusiasm. People did acknowledge the benefits of being able to heat the whole home evenly with gas central heating and remembered feeling glad to no longer have to clean out the grate, but this was a less remarkable era in home heating. Participants talked about it in less detail, for less time and with less enthusiasm.

    Many of our Finnish participants, despite having heat pumps or connection to a district heating network, wanted to continue burning wood at home. This treasured practice brought a sense of wellbeing. The intense pleasure of the fireside created a sense of homeliness and enabled cultural traditions such as cooking on a wood fire, plus the multi-sensory experience of a wood-fired sauna.

    Some participants worried about being considered an “environmental criminal” for driving a diesel car, but regarded burning wood as more socially acceptable. Outside of cities, plots of woodland are inherited in some families. Gathering firewood was a ritual many enjoyed and didn’t want to give up.

    Nice, but not sustainable.
    Skylines/Shutterstock

    More affluent participants in the UK also valued their wood burning stoves – a growing trend essentially borrowed from Scandinavian neighbours. Those we interviewed in Sweden also prized their wood burners but usually only in the homes or cabins where they holidayed.

    Thermal delight

    In 1979, US architect Lisa Heschong introduced the concept of “thermal delight”, arguing that building designers were forgetting the importance of enabling pleasure through heat. Our research participants had not forgotten, however, and confirmed that we seek the most joyous route to warming our bodies.

    While the necessary speed of the net zero transition entails a clean sweep that replaces fossil-fuelled heating with low-carbon, electric alternatives, our research shows that this may be unappealing to many households.

    The people we met wanted heating options to reflect different needs and preferences. Our participants valued central heating for bringing their houses to a consistent temperature, but this did not preclude a desire for the radiant heat of the log burner on some days. They also wanted the option of plugging in a portable, electric heater when they only needed to heat one room.

    They enjoyed the contrast between the intense warmth of the fireside and a cool bedroom and many regarded an even heat throughout the home as “uninviting” – something that met their needs but not their desires. The experience of different eras of home heating had taught them the value of flexibility and variety, which makes a “clean sweep” to electric heating unattractive.

    These findings do not mean that heat pumps are doomed. Indeed, heat pumps have a lot to offer in terms of reducing heating emissions. What we found does indicate a need for multiple ways to heat the home within scenarios for reaching net zero emissions.

    The transition from coal to gas heating is within living memory in the UK.
    AstroStar/Shutterstock

    Partly, this calls for innovation in home heating technology. There is really no place for burning solid fuels in a net zero future, but a concerted effort between heating researchers, designers and technologists could create a beautiful heat source that acts as a focal point, and offers something akin to the multi-sensory joy of the fireside.

    The findings also indicate the need to change how heating transitions are talked about by the government and energy companies: away from an implacable duty to switch heating sources and the need for efficiency, and towards the joy and abundance of a heat source that (in the case of heat pumps) offers four times the heat output for the same energy input as a gas boiler.
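    That “four times the heat output” claim can be made concrete with a rough, back-of-the-envelope comparison. This is an illustrative sketch only: the coefficient of performance (COP) of 4 comes from the article, while the 90% boiler efficiency and the 12,000 kWh annual heat demand are assumed typical values, not figures from the research.

    ```python
    # Rough comparison of the energy a gas boiler vs a heat pump must
    # consume to deliver the same amount of useful heat.
    # COP of 4 is cited in the article; boiler efficiency and heat
    # demand below are illustrative assumptions.

    def energy_input_kwh(heat_demand_kwh, efficiency):
        """Energy a heating system consumes to deliver heat_demand_kwh of heat."""
        return heat_demand_kwh / efficiency

    annual_heat_demand = 12_000  # kWh of useful heat (assumed typical home)

    gas_boiler = energy_input_kwh(annual_heat_demand, efficiency=0.9)  # gas consumed
    heat_pump = energy_input_kwh(annual_heat_demand, efficiency=4.0)   # electricity consumed

    print(f"Gas boiler input: {gas_boiler:,.0f} kWh")   # about 13,333 kWh
    print(f"Heat pump input:  {heat_pump:,.0f} kWh")    # 3,000 kWh
    ```

    On these assumed figures the heat pump consumes less than a quarter of the input energy for the same heat delivered, which is the framing the researchers suggest communicators lead with.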

    The best way to sell the low-carbon heating transition is locally, where the kinds of attachments and allegiances to heat that we have uncovered are best appreciated and understood. Local authorities are typically best placed to do that.




    Aimee Ambrose receives funding from The Collaboration for the Humanities and Social Sciences in Europe (CHANSE) and The Arts and Humanities Research Council (AHRC).

    – ref. Heat pumps have a cosiness problem – https://theconversation.com/heat-pumps-have-a-cosiness-problem-249529

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI USA: UConn Researchers Tracking Change in Precious Ecosystems

    Source: US State of Connecticut

    Primary forests, or old-growth forests as they are sometimes called, are epicenters of rich biodiversity, are more resilient than younger forests, and store significantly more carbon than their younger counterparts. The preservation of these essential and irreplaceable ecosystems is the focus of global conservation efforts.

    The UConn Global Environmental Remote Sensing (GERS) Lab has developed a new remote sensing method to continuously monitor primary forest loss and determine what factors are driving that loss. Their findings are published in Remote Sensing of Environment.

    Lead author and Department of Natural Resources and the Environment Ph.D. student Falu Hong says that they focused on these key habitats on the island of Hispaniola, which includes Haiti and the Dominican Republic, using satellite images from the years 1996-2022.

    “We used a satellite time series to track primary forest loss, and we focused on these two countries because they have experienced significant primary forest loss and because they are ignored in previous studies, especially Haiti, which is one of the hotspots of biodiversity loss,” says Hong. “We analyzed the forest loss over 27 years of land cover change, which has not been done in previous studies.”

    The researchers analyzed multiple dimensions of forest loss, including primary forest inside and outside of protected areas and the drivers of forest loss. They applied the COntinuous monitoring of Land Disturbance (COLD) algorithm to remote sensing data from Landsat to create a map of primary forest loss.

    Ji Won Suh, a postdoctoral researcher in the GERS lab, says this study showcases the power of using Landsat time series data.

    “So few studies focus on primary forests because it is very difficult to map them using remote sensing signals. Sometimes it is difficult to differentiate a secondary forest or regenerated forest from a primary forest, but this study successfully classified those primary forests using a random forest machine learning model.”
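The random forest classification step described here can be sketched in a few lines. The following is a hypothetical illustration on synthetic spectral features, not the study's actual data or code; scikit-learn's `RandomForestClassifier` stands in for whatever implementation the GERS lab used.

```python
# Hypothetical sketch: separating primary from secondary forest pixels with a
# random forest, in the spirit of the approach described above. The spectral
# feature values and class labels below are synthetic, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic per-pixel features (e.g. band statistics from a Landsat time
# series): two slightly overlapping classes, primary vs. secondary forest.
n = 500
primary = rng.normal(loc=[0.25, 0.45, 0.30], scale=0.05, size=(n, 3))
secondary = rng.normal(loc=[0.30, 0.40, 0.35], scale=0.05, size=(n, 3))
X = np.vstack([primary, secondary])
y = np.array([1] * n + [0] * n)  # 1 = primary, 0 = secondary

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)  # fraction correct on held-out pixels
print(f"held-out accuracy: {accuracy:.2f}")
```

In practice the accuracy of such a map is verified against independent reference data, as the researchers did with their collaborator's expert knowledge of Hispaniola's primary forests.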

    Suh says the accuracy of the map was verified by their collaborator and co-author S. Blair Hedges from Temple University, who is an expert on primary forests on Hispaniola Island.

    “Another unique part of this study is we created a primary forest map over time,” Suh says. “Usually other studies just focused on a one-time event. We can track the loss of primary forests over many years. Our study is a way where we can map the trajectory of loss as it happens and we can analyze why those losses happen.”

    They found the main drivers of primary forest loss in Haiti are fire, which caused around 65% of the observed losses, followed by logging which accounted for about 20% of the primary forest loss, and around 10% of the forest loss was attributed to hurricane damage.

    “We found that in 2016, Hurricane Matthew destroyed around 12% of the primary forest in Haiti, just in one year,” says Hong. “That’s a huge amount of loss. With our map we can visualize the primary forest change and analyze the drivers causing that change. We can also analyze forest fragmentation. Usually, primary forests are homogeneous, but activities like construction or logging result in the forest becoming more and more fragmented. We quantified the fragmentation level of the primary forest which could give good insight into biodiversity conservation and preservation.”

    They also found that primary forest fragmentation is more pronounced in Haiti, where patches of primary forest are smaller and less numerous. Primary forests in both Haiti and the Dominican Republic are located on steep terrain, indicating that primary forests located in flatter and more accessible areas are prone to development and forest destruction.

    This paper is the first step in a larger project, says Hong, where the next steps are to begin expanding the mapping across the Caribbean region to evaluate the impact of primary forest loss on biodiversity change.

    GERS Lab Director and Associate Professor in the Department of Natural Resources and the Environment Zhe Zhu says that because primary forests hold the lion’s share of biodiversity, and many of the species living there are endangered, the preservation of these irreplaceable ecosystems is paramount. Having a reliable method to map primary forests accurately will help in the effort:

    “One thing I want to emphasize about this work is that it is very difficult to identify between different forests like primary dry forests, primary wet forests, and secondary forests, for example. A primary forest may look very similar if the secondary forest is old enough. You can have very subtle human disturbances causing it to no longer be a primary forest. You need to know the driver and how severe the drivers are. You also need to know the resilience of the trees.”

    This work is supported by a $2 million NSF grant with the goal of linking remote sensing to track biodiversity through time.

    “We are treating remote sensing as a time machine to go backward and forward to forecast future impacts on biodiversity. It is a very fun project that a lot of us in the GERS lab are working on,” says Zhu.

    Tracking the impacts on biodiversity and the drivers of change is important for conservation and policymaking, and studies like this can yield surprising results and insights into what needs to happen to preserve vital ecosystems like primary forests.

    This work was supported by a grant from the NSF Biodiversity on a Changing Planet (BoCP) program (2326013 and 2326014).

    MIL OSI USA News –

    February 18, 2025
  • MIL-OSI Global: Too distracted to watch? Netflix has the perfect ‘second-screen’ show for you

    Source: The Conversation – Canada – By Daphne Rena Idiz, Postdoctoral fellow, Department of Arts, Culture and Media, University of Toronto

    Overly expository dialogue, repeating plot points and lots of voice-overs to narrate action help distracted viewers along. (Shutterstock)

    Netflix knows we’re on our phones while we watch TV. Recent articles discuss requests from Netflix and other streamers for creatives to produce content optimized for casual viewing, meaning content intentionally scripted for distracted viewers.

    I’ve spent the last few years researching how Netflix shapes European screen production, a region where the streaming giant has invested billions in original content.

    I first encountered the concept of “second-screen shows” — created with distracted viewing in mind — in 2022.

    At the time, I was doing interviews with producers, showrunners, screenwriters and directors who had worked on European Netflix originals (due to confidentiality, they have been given pseudonyms here). Two of my interviewees described what they saw as very unusual feedback coming from Netflix executives: make a show that the audience can follow without looking at the screen.

    Recipe for a ‘second-screen show’

    So, how exactly do you make a second-screen show?

    One of my interviewees, Eleven, said that Netflix explicitly labels certain series “second-screen shows” and develops them as such. Another, Tokyo, shared their experience encountering similar directives:

    “[Netflix] basically said, ‘What you need to know about your audience here is that they will watch the show, perhaps on their mobile phone, or on a second or third screen while doing something else and talking to their friends, so you need to both show and tell, you need to say much more than you would normally say. […] You need your audience to understand what’s going on, even if they’re not looking at the screen.’”

    These series are designed around the viewing behaviours of their target audience, described by my interviewees as “younger” and “young adult” viewers.

    As Eleven explained, a Netflix executive would talk about how “in this show, we have to make sure that the points come through, even though kids are watching TikTok while they watch it.”

    Because Netflix knows a certain target audience will be “second-screening” these series, the streamer wants the show’s writing to facilitate this practice. Concretely, this means overly expository dialogue, repeating plot points and adding lots of voice-overs to narrate the action and help the distracted viewer follow along.

    Other sources cite examples where screenwriters were told to have characters announce what they’re doing and make the show less distracting from the viewer’s “primary screen” (their phone).

    Eleven joked about how if a character was sad, Netflix would ask to include a line of dialogue for the character saying, “I’m sad” with tears streaming down their face, while rain pours, and mournful violins play in the background.

    Here, the golden rule of screenwriting “show, don’t tell,” is cast aside for “show and tell” (and tell again). Joking aside, they reflected: “It saddens me, on behalf of great storytelling traditions.”

    The revival of casual viewing

    But are second-screen shows really the final nail in the coffin for prestige TV? The idea of casual or background viewing is not new.

    From soap operas to sitcoms to reality TV, there is a long history of content targeting the distracted viewer.

    Sometimes we’re just tired and need an easy watch. But these types of series are a far cry from the era of HBO-style Netflix, hyping itself as the home of quality TV, a place where showrunners could find unprecedented creative freedom.

    There is still a time and place for complex storytelling. But data suggests that over half of viewers in many national markets — including in India, the United Arab Emirates, Australia, the United States, Britain and Denmark — are periodically checking their phones while watching TV. And Netflix is creating shows that enable this ritual.

    ‘Cult’ of data

    Netflix’s strategy has always hinged on a granular understanding of its users. Netflix collects a huge amount of data on its subscribers and their viewing behaviors: what they’re watching, how, when, where and on what device. This information is used by teams of data scientists to not only improve Netflix’s personalization but also to help with decisions about what content to develop and how.

    Yet research suggests Netflix has really cultivated the “myth of big data,” flip-flopping over the years about how much data influences the creative process of Netflix productions.

    And while screen workers may resist what they sense about analytics as they participate in creative processes, ultimately, it is the executives greenlighting content who interpret data and choose how to use it.

    Geralt, another producer I interviewed, described how “whenever you talk to the algorithm people and the data people at Netflix, it feels like a cult. They talk about the algorithm like it’s a god, like ‘Well the algorithm tells us…’”

    One part of the content strategy

    With that said, it’s critical to take blanket statements about Netflix’s operations with a grain of salt.

    The behemoth operates in more than 190 countries, with offices in 30, housing different teams and producing content around the globe. It’s estimated that 589 new Netflix originals were added in 2024.

    Recent articles about “second screen” productions focused on the U.S. context, and my research did not seek to determine how many Netflix productions are made this way.

    Netflix’s goal these days, according to CEO Ted Sarandos, is to be “equal parts HBO and FX and AMC and Lifetime and Bravo and E! and Comedy Central.”

    Second-screen shows, it seems, are one part of this strategy.

    Outlook for storytellers

    It’s clear that viewing behaviours are driving changes in storytelling. But for screenwriters today, second-screen shows are only a symptom of bigger problems.

    Between a shrinking drama market and the competition for attention from platforms like YouTube and TikTok, streamers are investing a lot less in content than they used to. They’re also much more risk-averse with these investments.

    Even before now, producing for streamers brought its own set of challenges.

    Writer advocates with the 2023 TV writers strikes highlighted how streaming introduced new and exciting formats for TV writing, but also a new kind of precarity. And concerns continue to loom around how AI might impact creativity, career sustainability and IP rights.

    Last year, the Canadian Media Producers Association joined production organizations around the world in issuing a call for streaming regulation that underscores independence, IP rights and fair remuneration.




    Read more:
    Online Streaming Act: As we revisit Netflix support for Canadian content, it’s about more than money


    It’s no surprise the mantra across the media industries last year was “survive ‘til ’25.”

    As media creators become increasingly dependent on data-driven tech companies, they will continue producing content to the whims of executives following the holy algorithm.

    The next time you’re watching a Netflix show and feel the urge to scroll during another repetitive voice-over, the question is: Are some shows written like this because the audience is disengaged, or is the audience disengaged because shows are written like this?

    Daphne Rena Idiz does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    – ref. Too distracted to watch? Netflix has the perfect ‘second-screen’ show for you – https://theconversation.com/too-distracted-to-watch-netflix-has-the-perfect-second-screen-show-for-you-249012

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: What Canada can learn from the European Union about dealing with chaos and crises

    Source: The Conversation – Canada – By Jörg Broschek, Professor and Laurier Research Chair, Political Science, Wilfrid Laurier University

    As United States President Donald Trump continues to threaten Canada’s economic and political sovereignty, some observers have floated the idea of Canada becoming a member of the European Union.

    Since there is no feasible pathway to EU membership in the short term, current efforts rightly focus on strengthening Canada’s existing trade relationships, most notably through the Canada-European Union Comprehensive Economic and Trade Agreement.

    But something else is often overlooked: Canada should also learn from the EU how to cope with the monumental challenges ahead. Europe is not only less vulnerable than Canada due to its geographic position and economic power, it’s also more resilient.

    Three goals

    Unlike “Team Canada,” “Europe United” has already crafted a multi-pronged policy framework to counter the long-term risks arising from a fundamentally changing geopolitical environment. The EU also has a more robust institutional framework for intergovernmental co-operation.

    Under the leadership of President Ursula von der Leyen, the European Commission has launched a cascade of relatively coherent policies aimed at facilitating three broad goals: decarbonization, economic sovereignty and national security.

    Key pillars of this new policy framework are the European Green Deal of 2019, the European Industrial Strategy of 2020, the European Economic Security Strategy of 2023 and the 2024 European Defence Industrial Strategy.

    These policy initiatives have been continuously updated, fine-tuned and aligned with each other. They have created an umbrella that enables the EU and its member states to simultaneously promote the green transition, strengthen the internal market and domestic industries as well as reduce economic and security risks.

    The geopolitical and industrial changes in the EU resemble what used to exist in Canada as well: national policies — the conscious, nation-building initiatives of successive federal governments.

    But Canada has lost the ability to plan strategically for the long term and now responds to every crisis in a reactive, punctuated manner. In doing so, Canadian officials address symptoms without tackling root causes.

    EU architecture

    The institutional architecture of the EU also furnishes governments with more capacity to collaborate. In all federal systems, responsibility for most policies is shared across levels of government, which is why intergovernmental co-ordination is important to buttress and consolidate policy innovations.




    Read more:
    Canada-U.S. history provides lessons on how Canada can deal with a hostile Donald Trump


    Notably, the Council of the European Union plays a key role for co-ordinating and negotiating policies, in addition to its function as the main decision-making body (together with the European Parliament).

    It is composed of ministers of the EU member states. Accordingly, it works in different configurations, depending on the portfolio. The head of governments themselves meet regularly through a separate institution, the European Council.

    In Canada, by contrast, federal intergovernmental institutions are fragile or don’t even exist, even though they’re comparatively strong on the municipal level.

    Municipalities co-ordinate through the Federation of Canadian Municipalities (FCM), which was established in 1901. But it was not until 2004 that provinces and territories established the Council of the Federation. This body, however, has remained weak, with very little administrative support.

    What’s even more striking is that there is no formalized, institutionalized framework at all at the federal level. The First Ministers’ Conference meetings are held at the discretion of the prime minister. In their communique following a Council of the Federation meeting in November 2023, premiers complained that “the prime minister has not convened a full in-person First Ministers’ Meeting since December 2018 despite repeated requests from premiers.”

    Widespread tariffs against Canada may be on hold until March, but there is no way back. As Canadians experience their very own “Zeitenwende” — the end of an era — in the wake of Trump’s desire to absorb Canada into the U.S., the country’s leaders should draw two lessons from the EU.

    All-encompassing approach needed

    On the policy level, Canada does need a new “national policy,” as I have argued previously.

    More than 40 years ago, the Macdonald Commission paved the way for a major transformative shift in Canadian policy-making, including free trade with the U.S. But since the global financial crisis of 2007-2008, it has become increasingly clear that this model of socioeconomic development is outdated.

    Yet the model has never been replaced. Unlike the EU, Canadians have comforted themselves with patchwork policies instead of crafting a new, all-encompassing approach.

    The challenges the EU and Canada face are similar, but Canada needs to find its own response. Forging a new model will require mobilizing and aligning key sectors like trade, infrastructure and industrial policy in a coherent manner.

    On the institutional level, Canada must — finally — institutionalize Team Canada. It’s a positive development that First Ministers’ Conference meetings have resumed, but an ad hoc approach to intergovernmental collaboration is no longer sufficient.

    Team Canada may work under pressure when facing a short-term threat. Without a stronger institutional foundation, however, Canada won’t be able to consolidate a new national policy over the long term.

    The EU has accomplished a remarkable resurgence, despite all remaining difficulties. Rather than chasing the idea of joining the EU, Canada should use the European example as a road map for enhancing its policy and governance capacities.

    Jörg Broschek receives funding from the Social Sciences and Humanities Research Council of Canada (SSHRC).

    – ref. What Canada can learn from the European Union about dealing with chaos and crises – https://theconversation.com/what-canada-can-learn-from-the-european-union-about-dealing-with-chaos-and-crises-249462

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: Canadian immigrants are overqualified and underemployed — reforms must address this

    Source: The Conversation – Canada – By Marshia Akbar, Director of the BMO Newcomer Workforce Integration Lab and Research Lead on Labour Migration at the CERC Migration and Integration Program at TMU, Toronto Metropolitan University

    Canada’s labour market struggles are not caused by the number of newcomers, but by systemic issues such as underemployment and skills-job mismatches. (Shutterstock)

    Recent immigration reforms in Canada have cut international student and temporary resident numbers, restricted work permits for them and their spouses and aim to reduce permanent resident admissions by 21 per cent in 2025, with further cuts ahead.

    These changes are intended to reduce competition with unemployed Canadians at a time of rising unemployment. However, they may eventually intensify dysfunctions in the Canadian labour market.

    With an overall unemployment rate of 6.6 per cent and a youth unemployment rate of 13.6 per cent alongside a worsening housing crisis, these policies reflect growing pressures.

    However, blaming newcomers — particularly international students and their spouses — for job shortages overlooks deeper structural issues in the labour market. Canada’s labour market struggles are not caused by the number of newcomers, but by systemic issues such as underemployment and skills-job mismatches.

    Unemployment and underemployment

    While rising unemployment is affecting everyone, newcomers have been hit especially hard. In 2024, the unemployment rate for immigrants hit 11 per cent — more than double the 5.6 per cent rate for Canadian-born workers.

    Underemployment is also a persistent issue for immigrants. In 2021, only 44 per cent of immigrants who had arrived in Canada within the previous decade were employed in jobs matching their education level, compared to 64 per cent of Canadian-born workers aged 25 to 34.

    The over-education rate — the proportion of university graduates working in jobs for which they are over-qualified despite holding a bachelor’s degree or higher — was 26.7 per cent for immigrants, more than double the 10.9 per cent rate for Canadian-born workers in 2021.

    Immigrants, particularly those with foreign credentials, are significantly more likely to experience these job-education mismatches compared to Canadian-born workers.

    Approximately two thirds of recent immigrants held a degree from a foreign institution. The over-education rate for these immigrants was 24 per cent higher than that of younger Canadian-born workers.

    The underemployment experienced by many newcomers is largely driven by employers favouring Canadian experience — despite such preferences being illegal in Ontario — and by reliance on referral networks, which often disadvantage newcomers.

    Hiring managers frequently undervalue international credentials, even when assessed by organizations like World Education Services. Many employers struggle to assess foreign work experience. Some also perceive a lack of familiarity with Canadian workplace norms as a hiring risk.

    Ultimately, hiring managers tend to choose the less risky option, as a bad hire can reflect poorly on them. An exceptional hire, on the other hand, doesn’t necessarily bring them equivalent rewards.

    International experience is undervalued

    International graduates with Canadian degrees generally achieve better labour market outcomes than those educated entirely overseas, experiencing higher earnings and improved job matches.

    However, many still face significant barriers, primarily due to employers’ preference for specific Canadian experience and biases in assessing their skills.

    Although many international students (277,400 in 2018) gain Canadian work experience during their studies and develop soft skills — often in low-paying, customer-facing roles such as accommodation and food services, retail, hospitality or tourism — this experience is often dismissed as irrelevant to professional roles.

    This creates a paradox: employers require Canadian experience for entry-level positions in their field, yet without prior experience, graduates struggle to get hired in the first place.

    In addition, employers often lack clarity about international graduates’ visa statuses, work permit durations and future stays in Canada. Constantly changing policies exacerbate this confusion, deterring employers from hiring.

    A path forward

    Canada’s long-term competitiveness is hindered not by immigration, but by systemic labour market discrimination and inefficiencies that prevent skilled newcomers from fully contributing to the economy.

    Eliminating biases related to Canadian work experience and soft skills is key to ensuring newcomers can find fair work. The lack of recognition of foreign talent has a detrimental effect on the Canadian economy by under-utilizing valuable human capital.

    To build a more inclusive labour market, a credential recognition system should support employers in assessing transferable skills and experience to mitigate perceived hiring risks related to immigrants.

    For international students, enhanced career services at educational institutions are critical. Strengthening partnerships between universities, colleges and employers can expand internships, co-op placements and mentorship programs, providing students with relevant Canadian work experience before graduation.

    Such collaboration is also key to implementing employer education initiatives that address misconceptions about hiring international graduates and highlight their contributions to the workforce.

    Artificial Intelligence (AI) can also play a role in reducing hiring biases and improving job matching for new immigrants and international graduates. Our recent report, which gathered insight from civil society, the private sector and academia, highlights the following AI-driven solutions:

    • Tools like Toronto Metropolitan University’s AI resume builder, Mogul AI, and Knockri can help match skills to roles, neutralize hiring bias and promote equity.

    • Wage subsidies and AI tools can encourage equitable hiring, while AI-powered programs can help human resources recognize and reduce biases.

    • Tools like the Toronto Region Immigrant Employment Council Mentoring Partnership, can connect newcomers with mentors, track their skills and match them to employer needs.

    Harnessing AI-driven solutions, alongside policy reforms and stronger employer engagement, can help break down hiring barriers so Canada can fully benefit from the skills and expertise of its immigrant workforce.

    Marshia Akbar receives funding from the Social Sciences and Humanities Research Council of Canada (SSHRC).

    Anna Triandafyllidou receives funding from the Social Sciences and Humanities Research Council of Canada (SSHRC), the Tri-Agency of Research Councils, Canada and Horizon Europe framework program of the European Commission.

    – ref. Canadian immigrants are overqualified and underemployed — reforms must address this – https://theconversation.com/canadian-immigrants-are-overqualified-and-underemployed-reforms-must-address-this-247974

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI Global: Namibia’s Shark Island: Europe’s push for green hydrogen risks compromising sites of colonial genocide

    Source: The Conversation – Canada – By Rosanna Carver, Postdoctoral Research Fellow, University of Victoria

    An aerial view of Shark Island and the town of Lüderitz in Namibia. (Black Court Studios)

    In September 2025, Namibia will host the Global African Hydrogen Summit. The Namibian government has ambitions to turn the country into a leading producer of green hydrogen for export to markets in Europe and elsewhere. However, the lands and waters now regarded as being essential to Europe’s energy transition are tied to traumatic memories of colonial violence; especially the ocean, which is the final resting place for thousands of Namibians.

    As countries around the world transition to renewable energy, an inconspicuous peninsula in Namibia known as Shark Island is positioned to play a key role in the production of so-called “green” hydrogen, which is a proposed alternative to fossil fuels.

    However, the peninsula and its waters are at risk of being compromised by proposed port expansions to support the transportation of green hydrogen. Shark Island, near the town of Lüderitz, is now a campsite for tourists.

    But Shark Island is also called Death Island, and it was a concentration camp and a site of genocide during German colonial rule from 1884 to 1915. The concentration camp has since been destroyed, leaving little evidence of the violence that occurred there. However, recent international investigations highlight what many Namibians have known and worked on for generations.

    Germany’s colonization and genocide

    In 1884, German colonizer Adolf Lüderitz annexed Namibia, intending to finance colonial rule through minerals. Between 1904 and 1908, German colonial forces killed approximately 100,000 people (80 per cent of the Herero and half of the Nama population). The genocide also affected the ǂNukhoen and the ǂAonin communities.

    During the genocide, those who were not immediately killed were sent to concentration camps, where they were forced to perform manual labour, such as working on railways and harbours. This occurred across Namibia, including on the coast: in Swakopmund and Lüderitz alone, more than 1,550 Nama died.

    The research agency Forensic Architecture has digitally reconstructed the camps and identified evidence of burial places. On Shark Island, they demonstrate that the port expansion “poses further imminent risk to the site.”

    Attention has been given to the land-based component of green hydrogen projects including the multinational joint venture, Hyphen Energy. But the ocean, which Namibia’s development projects also interact with, is often overlooked as a space of memory, justice and relations. This is in part due to colonial and apartheid histories that erased or excluded people from the coasts and oceans.

    During colonial rule, German colonizers incarcerated Namibians offshore aboard ships. They also threw the bodies of those who had died in the concentration camp into the ocean. The local saying “the sea will take you” highlights how the ocean is involuntarily tied to memories of death and trauma.

    Namibians have not forgotten the violence that occurred on the land and at sea. Local groups are restoring grave sites and establishing memorials. Discussions of recognition, justice and equitable rights and access to the coast and ocean are important for Namibia’s communities and the descendants of those killed during the genocide.

    Waves of energy colonialism

    Green hydrogen has a central role in global decarbonization ambitions. Namibia is considered an “export production site” for Europe’s future hydrogen economy. This is due to its solar and wind potential, and access to the ocean.

    Hydrogen can only be produced in Namibia if the infrastructure exists to enable it. For example, hydrogen requires industrial and transportation infrastructure to reach international markets. To meet these demands, the Namibian Ports Authority is proposing port expansions in Walvis Bay and Lüderitz, where expansion could have implications for Shark Island and its waters.

    Campaigners in Namibia are demanding the government and industry halt the expansion plans on Shark Island, and meaningfully engage with reconciliation. Among them is the Windhoek-based Black Court Studio, where Natache Iilonga, co-author of this article, is the creative director.

    These proposed developments signal the continued European dominance in Namibia’s blue and green economy projects. They enable energy colonialism, where the push for green energy continues colonial injustices. European countries and industry perpetuate ecological, social and cultural harm to satisfy their own climate change agendas.

    Projects and partnerships between Namibia and European countries like Germany are emblematic of (neo)colonial power relations. While these projects propose to foster co-operation, they also continue to dispossess communities from their lands and waters, and erase environmental and cultural relations.

    Through “development assistance,” the German government and non-governmental organizations continue to influence economic projects in Namibia, while avoiding discussion of meaningful reparations for colonial crimes.

    Read more:
    Germany’s genocide in Namibia: deal between the two governments falls short of delivering justice


    The land and ocean are not merely passive witnesses to colonial violence. Black Court Studio incorporates the ocean as a dynamic participant in the conversation about these violent histories, and justice and healing. Through community exercises and counter-mapping, the studio explores people’s socio-cultural relations with the ocean.

    Together, the studio’s interventions are beginning to resituate previously erased and forgotten connections with Shark Island. This work also highlights cultural and spiritual relations with the ocean that persist despite this dispossession.

    Namibia’s ocean and coasts are not empty spaces to be exploited for the benefit of Europe’s energy future. A deeper understanding of histories and present-day connections provides lessons for meaningful reconciliation.

    Natache Iilonga is a practicing architect with Iilonga Architects Inc and the co-founder of Black Court Studios Namibia.

    Rosanna Carver does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

    – ref. Namibia’s Shark Island: Europe’s push for green hydrogen risks compromising sites of colonial genocide – https://theconversation.com/namibias-shark-island-europes-push-for-green-hydrogen-risks-compromising-sites-of-colonial-genocide-239549

    MIL OSI – Global Reports –

    February 18, 2025
  • MIL-OSI United Kingdom: Residents urged to help shape transformational multi-million-pound town centre plans

    Source: City of Stoke-on-Trent

    Published: Monday, 17th February 2025

    Residents are to be asked to have their say on multi-million-pound plans to transform three town centres.

    A total of £6 million in Government funding is being spent to carry out public realm improvements in Tunstall, Burslem and Stoke.

    The aim is to help attract extra footfall, boost business activity and tap into the heritage of each of the towns.

    In Tunstall, proposals include better connecting key areas of the town – the High Street, Tower Square, Butterfield Place and the Alexandra Park shopping area – and creating a high-quality public space for the town focussed on Tower Square and the Clock Tower.

    In Burslem, the proposed public realm work will focus on the Queen Street area, which is the location of several Listed Buildings including Burslem School of Art and the Wedgwood Institute. The scheme will improve the historic Conservation Area with natural stone paving and new tree planting to create an attractive environment, encouraging more people into the town centre.

    In Stoke, the money is to be centred on improving the public square on South Wolfe Street – adjacent to Stoke library – to create a vibrant community space for regular outdoor events, including street markets and live music events.

    Public consultations will take place over the next few weeks and will give people the chance to have their say on their priorities. Feedback will help shape the work, which will start later this year.

    The consultation events are:

    • Burslem – Burslem School of Art – Friday, 21 February (10am to 4pm) and Saturday, 22 February (10am to 2pm).
    • Tunstall – Tunstall Indoor Market – Friday, 7 March (10am to 4pm) and Saturday, 8 March (10am to 2pm).
    • Stoke – Stoke Indoor Market – Friday, 7 March (10am to 4pm) and Saturday, 8 March (10am to 2pm).

    The work will complement other Government-funded projects taking place in and around the towns, such as the £3.5 million former Tunstall Library building development; a £20 million transformation of the Spode site in Stoke; and multi-million-pound investment to preserve, protect and bring into use Burslem’s historic buildings – including The Wedgwood Institute and Burslem Indoor Market.

    Councillor Finlay Gordon-McCusker, cabinet member for Transport, Infrastructure and Regeneration at Stoke-on-Trent City Council, said: “We really want as many people as possible to feedback on these exciting plans which will help to breathe new life into Burslem, Stoke and Tunstall town centres.

    “The regeneration proposals we have set out will improve the economic viability of the towns, help better link key parts of the towns and are being developed to complement the other exciting project work taking place.

    “So, it is vital to hear residents’ priorities for spending the money and to use these ideas to shape the final plan ready for work to start later this year.”

    More details on the proposals will be added here closer to the consultation events: www.stoke.gov.uk/publicrealm 

    People can also have their say by emailing: PublicRealm@stoke.gov.uk

    MIL OSI United Kingdom –

    February 18, 2025
  • MIL-OSI United Kingdom: Start a new career in child and family social work

    Source: City of York

    The Step Up To Social Work programme at City of York Council is now open for recruitment, enabling aspiring social workers to apply for a place on the training course.

    The Step Up To Social Work programme at City of York Council opens for recruitment today [17 February] until 25 March.

    Step Up To Social Work is a 14-month, full-time training programme for talented graduates and career changers to become the next generation of child and family social workers supporting vulnerable children, young people and families. It is designed for people who want to become a social worker but do not have a degree in social work. Successful applicants train through a combination of academic study and hands on social work experience in a local authority.

    Applicants eligible for the programme, which includes financial support alongside training, will be individuals with experience of working with vulnerable children, young people, families or adults, and who can demonstrate emotional resilience and potential for success.

    Step Up To Social Work aims to attract applicants from a diverse range of backgrounds and to build a workforce that represents the society we serve.

    City of York Council is looking for four recruits as part of the scheme.

    Cllr Claire Douglas, Leader of City of York Council said:

    Social work is a challenging and incredibly fulfilling profession, which really does change lives for the better.

    “People may not know exactly what being a social worker involves but we have lots of experienced professionals who can explain the role for those who want to learn more. I’d encourage anyone who’s wondered about social work to get in touch and find out how being a social worker can benefit children and families in York. And for those who join us, we have fantastic, dedicated, and enthusiastic social work teams who will support and guide you every step of the way.”

    Isabelle Trowler, Chief Social Worker for Children and Families, said: 

    It is excellent to see the quality of the hundreds of graduates who qualify as social workers through the Step Up programme, and I’m encouraged to see them start out on a long-term career in social work. Our profession is highly challenging, but highly rewarding, and Step Up is developing a highly skilled workforce ready to make a genuine positive impact on people’s lives.

    The Step Up programme is backed by the Department for Education to support 700 individuals to enter the social work profession in local authorities across England in 2026. This funding will support individuals with training costs and a bursary of £21,995 over the duration of the programme to support them whilst in training.

    This will be the ninth cohort of Step Up. Since 2010, the programme has successfully supported over 2,900 social workers to enter the profession across England.

    More information about the programme and how to apply is available at https://susw.eu-careers.pocketrecruiter.com/

    MIL OSI United Kingdom –

    February 18, 2025
  • MIL-OSI United Kingdom: GB Energy & Grangemouth show ‘You can’t trust Labour’

    Source: Scottish National Party

    ‘You can’t trust Labour’. It was an oft-made comment during the latter years of Tony Blair’s premiership, particularly because of his role in dragging the UK into the Iraq war on the basis of a lie.

    But it took six years for that phrase to become common usage. With the current Westminster Labour government of Keir Starmer it’s only taken six months.

    And recently we saw an example which explains why trust in Keir Starmer’s Labour party has nosedived.

    Before the 2024 election Labour promised that Aberdeen would get 1,000 jobs from hosting the GB Energy headquarters; but now the appointed boss of GB Energy says it will only create 200 jobs in five years.

    The GB Energy boss won’t even be working in Aberdeen, but in Manchester! So much for a ‘headquarters’ in Aberdeen.

    These revelations have been followed more recently by news that Grangemouth’s refinery is to close after 100 years.

    Again, another example of how Labour can’t be trusted.

    Before the election Labour, along with Keir Starmer and Anas Sarwar, promised to save the jobs:

    Before the Westminster election, Labour promised to save Grangemouth – they’ve broken that promise. pic.twitter.com/coS3gDL2l0

    — The SNP (@theSNP) February 6, 2025

    Now it’s scenes of Anas Sarwar repeatedly pleading that he’s powerless because it’s a private company…

    Anas Sarwar promised in the 2024 Westminster election that Labour would save the jobs at Grangemouth.

    Labour broke that promise. pic.twitter.com/GAng87jhjz

    — The SNP (@theSNP) February 7, 2025

    …a private company Labour will financially support when it comes to a football stadium in England and a refinery in Belgium!

    The Labour Government promises to back Ineos owners’ Old Trafford project; Ineos has also been given a £600 million loan guarantee by the UK Labour Government for a refinery in Belgium.

    I thought Labour promised to save Grangemouth! pic.twitter.com/MMqNq2TaqR

    — Gordon Macdonald MSP (@GMacdonaldSNP) February 9, 2025

    And it was Westminster who tied their own hands when it gave Grangemouth to the private sector:

    The SNP’s @MichelleThomson on @bbcdebatenight reminding us that Westminster tied its own hands when it gave Grangemouth to the private sector.

    Once again, Scotland is bearing the brunt of Westminster’s mismanagement. pic.twitter.com/kVeXePfm7k

    — The SNP (@theSNP) February 6, 2025

    Is it any wonder that even Grangemouth’s own Labour MP sounds like he doesn’t trust Labour?

    Even Labour’s own MPs know they’ve betrayed Grangemouth.

    Watch below: pic.twitter.com/roXsOcP17m

    — The SNP (@theSNP) February 10, 2025

    Even a letter he wrote to Starmer was signed by only one other Scottish Labour MP. So much for Scottish Labour MPs standing up for Scotland.

    But those two examples are just the tip of the iceberg when it comes to Labour promises.

    Take the WASPI women pensioners; betrayed so often by the Tories and now by Labour. As leader of the opposition, Starmer promised to “do something about it”, saying he understood their anger at having “the goalposts moved”.

    In 2020 he railed against the two-child cap on child benefits. In the days running up to the election Scots were told to vote Labour to end child poverty.

    Yet just after the election he suspended seven Labour MPs for voting with the SNP to scrap the cap on child benefit and tackle child poverty.

    Then there’s the winter fuel payment for pensioners. In the run-up to voting in July 2024, Starmer railed about how pensioners had suffered under the Tories and promised them security.

    Safely in Downing Street his government announced a cut to pensioners’ winter fuel payments despite research by his own party that it could cause 4,000 deaths.

    And what about National Insurance?

    Labour’s manifesto specifically pledged that they would not raise national insurance. In her budget Rachel Reeves increased employer national insurance – a policy that will hit those employing lower paid workers the hardest, charities, GPs and care homes.

    You would think such a level of untrustworthy behaviour would be more than enough after seven months; but there’s more that specifically affects Scotland.

    In the July 2024 election Anas Sarwar expressly promised that Scottish Labour ‘would put Scotland at the heart of Starmer’s government‘; and ‘stand up to Keir Starmer and defend Scotland’s interests‘.

    Instead, as a group, Scottish Labour MPs have meekly voted for cutting the winter fuel payment and keeping the two-child benefit cap, and have failed to support WASPI women.

    And there’s a range of issues where that group of MPs have been subdued when it comes to putting Scotland at the heart of Starmer’s government.

    In August 2024 Rachel Reeves pulled funding for an £800 million supercomputer at Edinburgh University, with a Labour source saying the project made “little strategic sense.”

    Yet by January Keir Starmer was announcing that his government had arranged £14 billion of investment in various AI projects.

    At the end of January Rachel Reeves announced her plans for growth in the UK … which amounted to concentrating UK government assistance in the cities hosting the UK’s two elitist universities.

    The absence of similar assistance for Scotland was notable, despite claims that the plans would deliver to “all corners of the UK“:

    Labour have announced they are investing taxpayer money to drive growth “across the UK”.

    Guess where almost all of it is going. pic.twitter.com/Fo332thuae

    — The SNP (@theSNP) January 30, 2025

    Take CCS, or Carbon Capture & Storage; since the 2014 independence referendum the North East of Scotland has been repeatedly promised that Westminster would invest millions in it.

    Rachel Reeves eventually announced funding for Carbon Capture & Storage … in Teesside and Merseyside. No Scottish Labour MP or MSP has even mentioned this slap in the face to Scotland.

    Is it any wonder Scots believe Anas Sarwar doesn’t stand up to Keir Starmer? It’s no wonder Scottish Labour’s vote is at its lowest level in three years.

    And what is Anas Sarwar’s latest move as we approach a Scottish election year? To say he is open to ‘good ideas’ from Nigel Farage’s Reform party.

    A party that would like to abolish the Scottish Parliament and privatise the NHS. The party of Brexit, which has increased the cost of living, leaving less money for public services.

    And Anas Sarwar’s latest gambit just raises more questions about trust in Labour. He’s now pledging to protect SNP policies like free tuition, free prescriptions and the Scottish Child Payment.

    After months of accusing the SNP government of ’18 years of failure’ he’s now saying it has been 18 years of “successes”.

    But why should anyone trust what many see as a panicked announcement by Anas Sarwar?

    On several occasions Labour’s Holyrood group of MSPs have voted against SNP government budgets which contained those policies. Even now they are not supporting the SNP budget containing those policies.

    A previous Scottish Labour leader notoriously called those policies a ‘something for nothing‘ culture which should end.

    Anas Sarwar’s health spokesperson, Jackie Baillie, is on record as saying prescription charges should “absolutely” be abolished.

    As for tuition fees, it was only in February 2024 that Sarwar’s finance spokesperson, Michael Marra, said backdoor tuition fees, like endowments, would have to be considered.

    Shortly after, Labour MSPs voted with the Tories in Holyrood against free tuition.

    And let’s not forget the behaviour of Anas Sarwar’s boss, Keir Starmer. In 2020 he promised Labour members in the party leadership election that he would “support the abolition of tuition fees”.

    Yet by September 2023 he claimed it would be ‘impossible‘ to abolish tuition fees … despite the fact that this is already the reality in Scotland.

    And let’s not forget which party first introduced tuition fees – whose policy they ultimately are.

    Just weeks before the 1997 election Tony Blair pledged: “Labour has no plans to introduce tuition fees for higher education.”

    A year after taking power, Blair went ahead and introduced tuition fees.

    It all just shows how the people of Scotland don’t and can’t trust any promise by Scottish Labour. Like a branch office they will always follow their bosses in Westminster.

    There’s only one party that Scots can trust to stand up and speak for Scotland, to speak out about Westminster ignoring your communities when it comes to investment, and to vote for the benefit of Scotland’s pensioners, families and workers – the SNP.

    MIL OSI United Kingdom –

    February 18, 2025