United States Vice President JD Vance made headlines this week by refusing to sign a declaration at a global summit in Paris on artificial intelligence.
In his first appearance on the world stage, Vance made clear that the U.S. wouldn’t be playing ball. The Donald Trump administration believes that “excessive regulation of the AI sector could kill a transformative industry just as it’s taking off,” he said. “We’ll make every effort to encourage pro-growth AI policies.”
But upon a closer look, events this week point to signs that just the opposite may be unfolding. A host of nations took notable steps towards addressing growing safety and environmental concerns about AI, indicating that a regulatory tipping point has been reached.
Prime Minister Justin Trudeau delivered the keynote address at the AI Action Summit in Paris, France.
The Paris communiqué calls for an “inclusive approach” to AI, seeking to “narrow inequalities” in AI capabilities among countries. It encourages “avoiding market concentration” and affirms the need for openness and transparency in building and sharing technology and expertise.
The document is not binding. It does little more than tout principles, or affirm a collective sentiment among the parties. One of these — perhaps the most important — is to keep talking, meeting and working together on the common concerns that AI raises.
While nothing is binding on the parties, the goals are notably specific. They include coming up with standards for measuring AI’s environmental impact and more effective ways for companies to report on the impact. Parties also aim to “optimize algorithms to reduce computational complexity and minimize data usage.”
Even if most of this turns out to be merely aspirational, it’s important that the coalition offers a platform for collaboration on these initiatives. At the very least, it signals a likelihood that sustainability will be at the forefront of debate about AI moving forward.
Separately, the Council of Europe's convention on AI and human rights, democracy and the rule of law commits parties to pass domestic laws on AI that deal with privacy, bias and discrimination, safety, transparency and environmental sustainability.
The treaty has been criticized for containing no more than “broad affirmations” and imposing few clear obligations. But it does show that countries are committed to passing law to ensure that AI development unfolds within boundaries — and they’re eager to see more countries do the same.
If Canada were to ratify the treaty, Parliament would likely revive Bill C-27, which contained the AI and Data Act.
The act aimed to do much of what Canada agrees to do under the convention: impose greater oversight of the development and use of AI. This includes transparency and disclosure requirements on AI companies, and stiff penalties for failure to comply.
What does this really mean?
While the U.S. signed the convention on AI and human rights, democracy and rule of law in the fall of 2024, it likely won’t be implemented by a Republican Congress. The same might happen in Canada under a Conservative government led by Pierre Poilievre. He could also decide not to fulfil commitments made under other agreements about AI.
The Trump administration may have ushered in a period of more lax tech regulation in the U.S., and Silicon Valley is indeed a key player in tech — especially AI. But it’s a wide world, with many other important players in this space, including China, Europe and Canada.
The events in Paris have revealed a strong interest among nations around the globe to regulate AI, and specifically to foster ideas about inclusion and sustainability. If the Paris summit was any indication, the hope of sheltering AI from effective regulation won’t last long.
Robert Diab does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Check against delivery.
1 Introduction
Ladies and gentlemen, it’s a pleasure and an honour for me to speak here before such a distinguished audience.
Remember to look up at the stars and not down at your feet. This was advice from Stephen Hawking, the famous English physicist and author of numerous books on the cosmos. And who would want to contradict the genius?
So today I invite you to join me on a stargazing tour. If you don’t have a telescope with you, no worries. However, I should add a disclaimer here: When a couple look up at the stars, things could get romantic. When astronomers observe the stars, impressive images can come into view. When economists talk about stars, it usually gets complicated. Now you know what you’re getting into!
I’m sure you’ve already guessed what topic I have in mind: the natural rate of interest – also known as r-star. It is a concept that economists have been grappling with for more than 125 years.[1] And it has perhaps never received more attention than in the current era of monetary policy.
From a central banker’s perspective, I would like to discuss what role r-star can and should play in the monetary policy universe. I will structure my lecture around four key questions: What is r-star and why is it of interest for monetary policy? How have estimates for r-star evolved over the past decades? What drives uncertainty about current estimates and the future evolution of r-star? What conclusions should monetary policy draw from this?
2 Definition of r-star and use for monetary policy
Let’s start with the definition. The natural rate is the real interest rate that would prevail if the economy were operating at its potential and prices were stable. R-star is commonly thought to be driven by real forces that structurally affect the balance between saving and investment. Think of technological progress and demographics, for example. This also means that r-star should, by definition, be independent of monetary policy. The latter follows from the widely held belief that monetary policy can affect real variables only temporarily, but is neutral in the long term.
At first glance, the natural rate could be a guiding star for the conduct of monetary policy. If a central bank sets its policy rates so that the real interest rate is above r-star, monetary policy is restrictive or “tight”. Consequently, economic activity slows and the inflation rate should decrease. If the real rate is below r-star, monetary policy is expansionary or “loose”. It provides incentives for consumers to purchase more and for enterprises to step up investment and output. Hence, this should result in more economic activity and a higher inflation rate.
However, the idea of the natural rate serving as a guiding star for monetary policy comes with profound challenges. Perhaps the name r-star evokes associations with astronomy and navigation. But these would be misleading. If r-star were like a star in the sky, it would be relatively easy to locate. Stars emit light and are therefore observable.
The natural rate is a theoretical concept. It is based on a hypothetical state of the world. That means the natural rate is, by nature, unobservable. It can only be estimated. For example, models use assumptions about the relationship between measurable variables and r-star. In this respect, the natural rate is not so much like a star shining brightly in the sky. It is more a case of dark matter. As it is invisible, astronomers infer dark matter indirectly by observing its gravitational effects.
If something is hard to find, it only spurs researchers to look even harder – whether they are astronomers or economists. Therefore, we can draw on a variety of estimation methods for the evolution of the natural rate.
3 Estimates for r-star over time
Since around the 1980s, various estimates of different types have been pointing to a downward trend for r-star over several decades and across many advanced economies.[2] In the wake of the global financial crisis, the estimates slumped to exceptionally low levels.[3] This development was roughly in line with the observed trajectory of actual real interest rates of short- and long-term government bonds during this period. And no wonder: In the long run, both should be driven by the same fundamental forces affecting the balance between saving and investment.
So the question is this: what has lifted saving and depressed investment? A simple answer would be: in the long term, the most important driver is potential growth. But this finding is not very enlightening. Potential growth is also not observable. It is determined by underlying forces such as demographics and technological progress. This is where we need to look for the causes.
Indeed, according to a number of recent studies, waning productivity growth and population ageing were the key factors in pushing saving up and investment down.[4] Lower productivity reduces the return on investment, so people are less willing to invest. As they expect to live longer, they are more willing to save.
In addition, inequality, risk aversion and fiscal policy could be other factors. For example, growing inequality raises saving, as richer households save a larger share of their income. Similarly, higher risk aversion leads to higher saving, especially in safe assets, while lowering investment.[5]
Many of the estimates for r-star reached their lowest point in the pandemic years 2020 and 2021. After that, there were signs of a partial reversal. A recent analysis by Eurosystem economists across a suite of models and data up to the end of 2024 suggests that estimates of r-star range from − ½ % to ½ % in real terms. In nominal terms, they find that it ranges between 1¾ % and 2¼ %.[6]
It is clear that these ranges depend on the estimating approaches considered. Taking into account an even wider array of measures, Bundesbank staff calculations using data up to the end of 2024 reveal a range of 1.8 % to 2.5 %.[7] And the ECB found for the third quarter of 2024: When three estimates derived from versions of the Holston-Laubach-Williams model are factored in, the range of real r-star is − ½ % to 1 % and the nominal range is 1¾ % to 3 %.
All in all, the results suggest that the range of r-star estimates most likely increased by about one percentage point from their lows. The latest estimates by economists from the Bank for International Settlements come to similar findings.[8]
The reasons for the increase after the pandemic are not yet fully clear. For example, high fiscal spending with rising public debt levels could play a role. Or higher needs for capital, as companies make their value chains more resilient by duplicating structures and increasing stock levels.
4 Uncertainties around r-star estimates
Stargazing tours in economics are a journey into the uncertain. This is also and especially true for r-star. Estimates of the natural rate of interest are subject to major uncertainties, shaped by three M’s: megatrends, methodology and monetary policy.
First, we are facing a number of megatrends. Think of climate change, ageing societies, digitalisation, and the risks of de-globalisation and increasing geopolitical divisions. The effects of these megatrends on natural rates are difficult to gauge and may change over time.
On the one hand, they could contribute to a higher natural rate. Here are some examples: The widespread uptake of artificial intelligence could boost productivity growth. The green transition could lead to higher investment. Fiscal deficits could persist at an elevated level due to higher defence spending given geopolitical tensions. The entry of the baby boomer generation into retirement could reduce savings.
On the other hand, life expectancy is predicted to keep rising; the high hopes for the productivity-enhancing effect of AI could turn out to be too optimistic; and given high public debt levels, fiscal space for additional spending is limited in many countries. Overall, it is virtually impossible to predict which developments will prevail in affecting r-star.
The second factor of uncertainty is methodology. The methods used to define and estimate r-star differ in important ways, especially in terms of time and risk.
Ricardo Reis demonstrates this impressively in a recent paper.[9] He presents four different “r-stars”. They are based on four different conceptual approaches. And they developed quite differently between 1995 and 2019.
One major difference is the risk dimension. Knut Wicksell’s original definition of the natural rate was the rate of return on physical capital in equilibrium.[10] The rate of return on physical capital is the return on investment in the real economy. And this rate is very much associated with risks.
However, this perspective has been lost in virtually all of the model approaches. Generally, they use rather secure government bond yields as a starting point. Again, with regard to the real economy, a risky return on capital would be a more appropriate yardstick. When we look at measures for the return on private capital, we see a strong contrast with risk-free rates. Returns on private capital have remained broadly stable over the last decades in the US,[11] Germany[12] and the euro area as a whole.[13]
From these observations, Ricardo Reis draws the following conclusion: focusing exclusively on the return on government bonds as the measure of r-star, while neglecting the return on private capital, leads to the wrong policy advice.[14]
Another case in point is the time horizon that is considered. Commonly cited estimates seek to assess the real rate that prevails in the longer run, when all shocks have dissipated. Most of these estimates are highly imprecise. Many methods simply project the current or the historical level of real rates into the future. This may confound permanent trends with cyclical factors, which may not be representative for the future. As a result, such methods could miss important turning points in real rate trends.
Other approaches characterise a short-run real rate in a hypothetical world without frictions. While interesting, this concept is of limited value for actual policymaking in the real world. Methods based on a short-term equilibrium tend to produce more volatile estimates of r-star.
There is a third reason for caution: monetary policy itself may play a role in shaping the natural rate or its estimates. A number of studies challenge the view that money is neutral in the long run.[15]
There are different channels through which monetary policy could have lasting effects on real interest rates. Prolonged tight monetary policy, for example, may lower investment, innovation and productivity growth.[16] By contrast, persistent monetary easing could fuel financial imbalances and contribute to zombification.[17]
Moreover, recent research suggests that central bank announcements provide guidance about the trend in real rates. For instance, a narrow window around Fed meetings captures most of the trend decline in US real long-term yields since 1980.[18] This could mean: when central banks look for r-star in financial market prices, they might actually be looking in a mirror.[19] Feedback loops between monetary policy and markets could unduly reinforce their perceptions about r-star. And shifts in perceived r-star could affect actual r-star as it influences saving and investment decisions.
5 Conclusions for monetary policy
Against the backdrop of these major uncertainties, the final key question of my speech is this: what role can and should r-star play for monetary policy in practice?
Let’s approach the answer with a thought experiment: Put yourself in the shoes of a monetary policymaker who only looks at r-star. The relevant interest rate with which you steer the monetary policy stance is currently 2.75 %. After a previous series of interest rate cuts, you consider whether a further cut would be appropriate.
Your staff inform you that various point estimates of r-star range from around 1.8 % to 2.5 % in nominal terms. If r-star were at the upper end of the estimates, the policy rate would become neutral with the next rate cut. Things would be different if r-star were at the lower end of the estimates: Monetary policy would continue to be restrictive, even after several further rate cuts.
So how would you proceed, given a certain stance you want to achieve? Beware: If you rely on a wrong estimate, your decision may have a different effect on inflation than you intended. Simply choosing the middle of the range might not be a happy medium. Around the point estimates, there are often uncertainty bands of different sizes and with asymmetries.
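To make the arithmetic of this thought experiment concrete, here is a minimal sketch using the figures quoted above. The 25-basis-point cut size is my assumption for illustration, not part of the estimates:

```python
# Gap between the policy rate and nominal r-star: positive means restrictive.
policy_rate = 2.75                # current policy rate, in percent
r_star_estimates = (1.8, 2.5)     # nominal r-star point estimates, in percent
cut_size = 0.25                   # assumed size of one rate cut, in pp

for r_star in r_star_estimates:
    gap = policy_rate - r_star            # distance to neutral, in pp
    cuts_to_neutral = gap / cut_size      # cuts needed to reach neutral
    print(f"r* = {r_star}%: gap = {gap:.2f} pp, "
          f"about {cuts_to_neutral:.0f} cut(s) of 25 bp to neutral")
```

If r-star sits at the top of the range, a single cut reaches neutral; at the bottom, policy remains restrictive through several more cuts, which is precisely the ambiguity the policymaker faces.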
As you have probably guessed: It is no coincidence that I have described this particular decision-making situation. It looks similar in the euro area ahead of the next monetary policy meeting of the ECB Governing Council at the beginning of March. After several rate cuts, the neutral rate could already be near – or there may still be some way to go.
The President of the New York Fed, John Williams, put the problem in a nutshell when he said: as we have gotten closer to the range of estimates of neutral, what appeared to be a bright point of light is really a fuzzy blur.[20]
The bottom line here is this: The closer we get to the neutral rate, the more appropriate it becomes to take a gradual approach. For this purpose, r-star is a helpful concept: it indicates when we need to be more cautious with policy rate moves so that we don’t take a wrong step.
At the same time, the limits of the concept are also clear: it would be risky to base decisions mainly on r-star estimates. Much more is needed to assess the current monetary policy stance and the optimal policy path for the near future.
That is why the Eurosystem uses a variety of financial, real economic and other indicators along the monetary policy transmission mechanism. We want the fullest picture possible. And, of course, r-star also has a place in this picture. For instance, r-star is included in model-based optimal policy projections that we use in the decision-making process.
In my opinion, proceeding in a data-driven and gradual manner has served the ECB Governing Council well. There is no reason to act hastily in the present uncertain environment. The data will tell us where we need to go.
Away from day-to-day monetary policymaking, the concept of the natural rate of interest provides a useful framework. This is also exemplified in the policy scenarios that Ricardo Reis presented last week in Brussels.[21]
He works with the assumption that government bond rates remain around current levels. I would add the assumption that inflation stays on target – actually, that is what I am in office for and committed to. Assuming output is at capacity, policy rates would be persistently higher than in the past. But the recommendations on actual monetary policy depend on the driving forces: is the new setting caused by less demand for safe and liquid assets or by an increase in productivity? And he has two more scenarios in his paper!
That provides a good example of why we should take a close look at the factors behind r-star estimates. Here it is important to even better understand the forces that are shifting real interest rate trends. We need to find out how these forces and trends affect our work to ensure price stability.
Reviewing our monetary policy strategy from time to time is therefore vital. That is precisely what we are doing right now in the Eurosystem. And, of course, in this process, we look at all the questions I mentioned about r-star.
Our stargazing tour is drawing to a close. It turns out we were dealing more with dark matter than with a shining star. Just as dark matter is an exciting field for astronomers, r-star is a rewarding topic for economists.
Using r-star alone to navigate the monetary policy universe could be like flying almost blind. But having it as one of many instruments in your cockpit is highly useful.
I would like to end by quoting Stephen Hawking again: Mankind’s greatest achievements have come about by talking, and its greatest failures by not talking.
Footnotes:
[1] Wicksell, K. (1898), Geldzins und Güterpreise: eine Studie über die den Tauschwert des Geldes bestimmenden Ursachen, Jena, G. Fischer (English version as ibid. (1936), Interest and prices: a study of the causes regulating the value of money, London, Macmillan).
[2] Obstfeld, M., Natural and Neutral Real Interest Rates: Past and Future, NBER Working Paper, No 31949, December 2023.
[3] Brand, C., M. Bielecki and A. Penalver (2018), The natural rate of interest: estimates, drivers, and challenges to monetary policy, ECB Occasional Paper, No 217.
[4] Cesa-Bianchi, A., R. Harrison and R. Sajedi (2023), Global R*, CEPR Discussion Paper No 18518; Davis, J., C. Fuenzalida, L. Huetsch, B. Mills and A. M. Taylor (2024), Global natural rates in the long run: Postwar macro trends and the market-implied r* in 10 advanced economies, Journal of International Economics, Vol. 149; International Monetary Fund (2023), The natural rate of interest: drivers and implications for policy, World Economic Outlook, April, Chapter 2.
[5] On the development of risk appetite in financial markets, see Deutsche Bundesbank, Risk appetite in financial markets and monetary policy, Monthly Report, January 2025.
[6] Brand, C., N. Lisack and F. Mazelis (2025), Natural rate estimates for the euro area: insights, uncertainties and shortcomings, ECB Economic Bulletin, 1/2025.
[7] Additional models would also provide values outside this range, but are currently not deemed sufficiently robust.
[8] Benigno, G., B. Hofmann, G. Nuño and D. Sandri (2024), Quo vadis, r*? The natural rate of interest after the pandemic, BIS Quarterly Review, March.
[9] Reis, R. (2025), The Four R-stars: From Interest Rates to Inflation and Back, draft working paper.
[10] Wicksell, K. (1898), op. cit.
[11] Caballero, R., E. Farhi and P.-O. Gourinchas (2017), Rents, Technical Change, and Risk Premia Accounting for Secular Trends in Interest Rates, Returns on Capital, Earning Yields, and Factor Shares, American Economic Review: Papers & Proceedings 107(5), pp. 614‑620.
[12] Deutsche Bundesbank, The natural rate of interest, Monthly Report, October 2017.
[13] Brand, C., M. Bielecki and A. Penalver (2018), The natural rate of interest: estimates, drivers, and challenges to monetary policy, ECB Occasional Paper, No 217.
[14] Reis, R., Which r-star, public bonds or private investment? Measurement and policy implications, Unpublished manuscript, September 2022.
[15] Jordà, Ò., S. Singh and A. Taylor, The long-run effects of monetary policy, NBER Working Papers, No 26666, January 2020, revised September 2024; Benigno, G., B. Hofmann, G. Nuño and D. Sandri (2024), Quo vadis, r*? The natural rate of interest after the pandemic, BIS Quarterly Review, March.
[16] Baqaee, D., E. Farhi and K. Sangani, The supply-side effects of monetary policy, NBER Working Paper, No 28345, January 2021, revised March 2023; Ma, Y. and K. Zimmermann, Monetary Policy and Innovation, NBER Working Paper, No 31698, September 2023.
[17] Borio, C., P. Disyatat, M. Juselius and P. Rungcharoenkitkul (2022), Why so low for so long? A long-term view of real interest rates, International Journal of Central Banking, Vol. 18, No 3.
[18] Hillenbrand, S. (2025), The Fed and the Secular Decline in Interest Rates, The Review of Financial Studies, forthcoming.
[19] Williams, J. C. (2017), Comment on “Safety, Liquidity, and the Natural Rate of Interest”, by M. Del Negro, M. P. Giannoni, D. Giannone, and A. Tambalotti, Brookings Papers on Economic Activity, Vol. 1, pp. 235‑316; Rungcharoenkitkul, P. and F. Winkler, The natural rate of interest through a hall of mirrors, BIS Working Paper No 974, November 2021.
[20] Williams, J. C., Remarks at the 42nd Annual Central Banking Seminar, Federal Reserve Bank of New York, New York City, 1 October 2018.
[21] Reis, R. (2025), op. cit.
Professor Ian Graham will rejoin the Board for a second term.
Professor Ian Graham has been reappointed to the board of Royal Botanic Gardens, Kew for a second term of three years.
His term will run from 1 May 2025 to 30 April 2028.
The reappointment has been made in accordance with the Governance Code on Public Appointments.
Biography
Professor Graham is currently based at the University of York, in the Centre for Novel Agricultural Products, and holds the Weston Chair in Biochemical Genetics. He has previously held roles at the University of Glasgow, the University of Oxford and Stanford University.
Professor Ian Graham completed his PhD in Plant Molecular Biology at the University of Edinburgh. His research interests now focus on plant natural products such as noscapine (anti-cancer), codeine (analgesic), and artemisinin (antimalarial).
Ian was elected as a Fellow of the Royal Society in 2016 and won the Biochemical Society’s 2017 Heatley Medal and Prize for “exceptional work in applying advances in biochemistry, and especially for developing practical uses that have created widespread benefits and value for society”.
The Royal Botanic Gardens, Kew
The Royal Botanic Gardens, Kew is a world-famous scientific organisation, internationally respected for its outstanding collections as well as its scientific expertise in plant and fungal diversity, conservation and sustainable development in the UK and around the world.
Kew Gardens is a major international visitor attraction and one of London’s top destinations. Kew Gardens’ 132 hectares of landscaped gardens, and Wakehurst, Kew’s wild botanic garden in Sussex, attract over 2.5 million visits every year. Kew Gardens was made a UNESCO World Heritage site in July 2003 and celebrated its 260th anniversary in 2019. Wakehurst is home to Kew’s Millennium Seed Bank, the largest wild plant seed bank in the world, as well as over 500 acres of designed landscapes, wild woodlands, ornamental gardens and a nature reserve.
The Kew Madagascar Conservation Centre is Kew’s third research centre and only overseas office. RBG Kew receives approximately one third of its funding from government through the Department for Environment, Food and Rural Affairs and research councils. Further funding needed to support RBG Kew’s vital work comes from donors, membership and commercial activity including ticket sales.
Source: The Conversation – UK – By David Benoit, Senior Lecturer in Molecular Physics and Astrochemistry, University of Hull
A detector on the seabed near Toulon, France, has spotted a high-energy neutrino. Ivan Bastien/Shutterstock
Recent research on lightweight particles called neutrinos might have passed you by – much like the more than 10 trillion neutrinos passing through your body each second. Now, our new paper – with 21 countries, more than 60 institutes and around 360 scientists contributing – reports the observation of the most energetic neutrino yet.
Despite the enormous number of neutrinos around us, this is one of the most exciting – and rarest – astronomical events of the year. Our paper has been published in the journal Nature.
Neutrinos are tiny elementary (sub-atomic) particles that are abundant in our universe. Yet, you probably haven’t seen any. They do not interact with other matter in the ways we are familiar with.
Their lack of charge, for example, means that the electrostatic force that governs most of our everyday experiences does not act on them at all. And their vanishingly small mass means that gravity – the other major force we experience – has no measurable effect on them in lab conditions on Earth.
So, detecting their presence is challenging to say the least.
They are formed through the actions of the weak nuclear force, which governs radioactive decay. It is this force that enables positively charged particles called protons, which make up the atomic nucleus, to change into neutrons, the neutrally charged particles that also exist in the nucleus, and vice versa.
We cannot detect a neutrino directly. But, every now and then (although very rarely), they might bump into something. When that happens, through the action of this weak nuclear force, a charged particle, such as an electron, may be created – seemingly out of nowhere – that we can detect.
Those charged particles travel at enormous speeds. And when they move through a medium such as water faster than light itself can travel in that medium, they create an eerie, faint blue glow. This phenomenon, called the Cherenkov effect, also happens in nuclear reactor containment pools.
How likely (or unlikely) are these interactions? Well, you would have to flip 75 heads in a row on a fair coin to have the same probability of a single neutrino interacting with a particle of matter. Think this is easy? Go ahead and flip them. It’ll take a while.
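As a quick check on this analogy (assuming a fair coin, so each flip has probability one half), the number works out to roughly one in 10²³:

```python
# Probability of 75 consecutive heads with a fair coin -- the article's
# stand-in for the chance that a single neutrino interacts with matter.
p = 0.5 ** 75
print(f"P(75 heads in a row) = {p:.1e}")  # roughly 2.6e-23
```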
Under the sea
The KM3NeT telescope collaboration uses this Cherenkov effect to scrutinise the depths of the Mediterranean Sea for the telltale faint glow of those neutrino events. They operate two huge detection stations – one off the shores of Toulon, France and one off the southern coast of Sicily. Scientists keep watch for events around the clock.
The scale of those detectors is gigantic, as are most neutrino detectors, since the only way of spotting the elusive neutrino collision is to try to increase the amount of matter that the neutrino can interact with. In fact, the KM3 part of the KM3NeT acronym stands for the cubic kilometre (KM3) of seawater that the detector will be surveying when completed.
The detection stations themselves each consist of nearly 600 light detectors – spherical buoys each containing 31 light sensing tubes, which are attached to cables anchored to the seabed up to 3.5km below the surface.
The particle described in our recent paper was detected on February 13 2023. You might wonder why it took so long to announce. The intervening time was spent by collaborators across Europe verifying and simulating the detection to confirm the nature of the event. After months of work by the KM3NeT team, we can finally say that this is the most energetic observation of a neutrino interaction ever recorded.
About 28,000 photons (light particles) were detected across the array in Sicily, indicating that a hugely energetic event had just happened. That said, an average 75W lightbulb generates millions and millions of photons every second (about 100 quintillion to be more precise). But while these few thousands of photons might appear to be a small event, remember that this has been generated by a single particle.
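The bulb comparison can be reproduced with a back-of-envelope calculation. The assumptions here, an average photon energy of about 2 eV and all 75 W radiated as light, are simplifications chosen only to get an order-of-magnitude figure:

```python
# Order-of-magnitude photon output of a 75 W bulb, assuming every joule
# is radiated as visible photons of ~2 eV (a simplifying assumption).
eV = 1.602e-19                     # one electronvolt in joules
photon_energy = 2.0 * eV           # assumed average photon energy, joules
photons_per_second = 75 / photon_energy
print(f"~{photons_per_second:.1e} photons per second")
```

This lands in the region of 10²⁰ photons per second, consistent with the quintillions quoted above, while the neutrino event yielded only about 28,000 detected photons from a single particle.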
In fact, the energy of the neutrino responsible for such a bright display was estimated to be 220 peta-electronvolts (PeV), or 30 times more energetic than the highest-energy neutrino recorded so far. In terms of particle energies, it is around 1,000 times more energetic than the particles generated at Cern, the most energetic accelerator facility in the world.
The light generated by this record-breaking event could be followed through the detector array and our collaboration was able to use it to reconstruct the near-horizontal trajectory of this high-energy neutrino. The path taken indicates that this neutrino is of cosmic origin.
We don’t know exactly where it comes from, but we’ve identified 12 potential blazars (bright cores of active galaxies) that may have produced it. It is also possible that it was created in the interaction of cosmic rays with photons from the cosmic energy background.
This detection provides a window into the ultra-high-energy phenomena happening in the universe and could, for example, help us better understand the nature of some of the most energetic cosmic rays. Moreover, the observation can help us further test the theoretical models that predict the existence of high-energy neutrinos.
David Benoit receives funding from the European Union, the Science and Technology Facilities Council and the UKRI National Quantum Computing Centre.
James Keegans receives funding from the European Union.
Source: The Conversation – UK – By Blane Savage, Lecturer in MA Creative Media Practice and BA(Hons) Graphic Art & Moving Image, University of the West of Scotland
The exhibition curator, James Knox, is to be congratulated on bringing together an impressive collection that tells the story of a diverse group of artists who helped transform and modernise British art in the early 20th century. It includes work held in private collections that has never before been seen by the public.
The Scottish colourists, as they were known, all visited and lived in Paris, and were heavily influenced by the burgeoning avant-garde scene there in the early years of the 20th century. This was during its most dynamic and transformative stage, when the cubist, post-impressionist and fauvist movements were evolving.
The exhibition highlights and contrasts the work produced by the colourists with that of Roger Fry’s Bloomsbury group members, Vanessa Bell and her amour Duncan Grant. It also includes work by the Fitzroy Street Group and several distinguished Welsh artists of the time, Augustus John and James Dickson Innes, as well as the fauvist artists Andre Derain and Kees van Dongen.
The colourists’ paintings stand out in the exhibition through the maturity and confidence of their artworks, the tonal qualities and vibrancy of their colour palettes consistently rising above the more muted works surrounding them.
The capacity of the colourists to study, travel and seek inspiration internationally, away from a grey Scottish Presbyterian climate, and in particular to embed themselves in the Paris art scene of the early 20th century, is impressive.
These artists stood shoulder to shoulder with their European contemporaries, inspired by the post-impressionist work of Cezanne, Matisse, Van Gogh and Derain. They delivered consistent and highly sophisticated artworks throughout their careers exploring light, shape and dynamic colour ranges, and often painted outdoors.
Each of the Scottish colourists returned to Scotland bringing new approaches to art with them. Peploe experimented with Cezanne-like geometric forms, whereas Fergusson’s practice was heavily influenced by the fauves. Hunter experimented with simplified post-impressionist blocks of colour to create dynamic shapes, while Cadell often focused on bold shapes and stylish impressionistic compositions.
Peploe, Hunter and Cadell exhibited at London’s Leicester Galleries in 1923, where they were first described as the “three colourists” by the critic P.G. Konody.
Peploe, Fergusson and Hunter’s reputations were enhanced in 1924 when their work was bought by the French state after an exhibition organised by one of the most influential art dealers in Europe, the Glaswegian Alexander Reid. Reid presented the four artists at the Galerie Barbazanges in Paris in a show entitled Les Peintres de L’Ecosse Moderne, turning their loose affiliation into an art movement.
Reid had also been responsible for developing the profile of The Glasgow Boys – a group of radical young painters whose disillusionment with academic painting signalled the birth of modernism in Scotland in the late 19th century – and was a central figure in building Sir William Burrell’s art collection. The Paris show was closely followed by a further exhibition at London’s Leicester Galleries in 1925, and another in Paris in 1931.
Peploe was the most commercially successful of the four artists, having a still life purchased by the Tate in 1927. His painting of Paris Plage captures the atmospherically startling white light of that French region. His studio work with a still life of flowers and fruit had the hallmarks of Cezanne’s style.
His love of outdoor landscapes, as shown in Kirkcudbright, painted in south-west Scotland, also recalls Cezanne’s primary geometric forms. He visited the island of Iona on a number of occasions with Cadell and other painters, and his fascination with its white sands, rocks and water can be seen in Green Sea, Iona.
Cadell was known for his powerful still lifes, stylish portraits of elegant women in hats, and for his landscape painting on Iona. Cadell’s Green Sea on Iona and Ben More on Mull on show are part of a series of paintings of the white sands he produced on his regular visits there.
J.D. Fergusson’s The Blue Hat, Closerie de Lilas is an outstanding piece on show, dazzling with the vibrancy of Parisian cafe life. He was attracted to fauve-like expressive colours and strong outlines in his work. The one piece of sculpture on display is also by Fergusson, whose foray into the sculptural medium with Eastre, Hymn to the Sun is striking in its modernist aesthetic – like the female robot in Fritz Lang’s Metropolis.
Unlike the others, Leslie Hunter had no formal art training. His Still Life with White Jug and Peonies in a Chinese Vase highlight his developing skills as a still life painter, and both have a striking vibrancy. His outdoor scenes use loosely styled daubs of colour in a post-impressionist style, often in bold hues.
All the Scottish colourists were recognised for their influence and contribution to the development of Scottish art during their lifetimes, combining aspects of The Glasgow School and cutting-edge Parisian avant garde. But they fell out of fashion due to economic decline before the second world war.
They were rediscovered and packaged as a collective in the 1950s, initially by the art historian T.J. Honeyman in his book Three Scottish Colourists, with J.D. Fergusson added to the group in the 1980s. Although their key role in the development of Scottish art history is assured, their work is, interestingly, appreciated even more in France than in Britain.
The Scottish Colourists: Radical Perspectives is on at the Dovecot Studios in Edinburgh until June 28.
Blane Savage does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
We are in the early days of a seismic shift in the global AI industry. DeepSeek, a previously little-known Chinese artificial intelligence company, has produced a “game changing” large language model that promises to reshape the AI landscape almost overnight.
But DeepSeek’s breakthrough also has wider implications for the technological arms race between the US and China, having apparently caught even the best-known US tech firms off guard. Its launch has been predicted to start a “slow unwinding of the AI bet” in the west, amid a new era of “AI efficiency wars”.
In fact, industry experts have been speculating for years about China’s rapid advancements in AI. While the supposedly free-market US has often prioritised proprietary models, China has built a thriving AI ecosystem by leveraging open-source technology, fostering collaboration between government-backed research institutions and major tech firms.
This strategy has enabled China to scale its AI innovation rapidly while the US – despite all the tub-thumping from Silicon Valley – remains limited by restrictive corporate structures. Companies such as Google and Meta, despite promoting open-source initiatives, still rely heavily on closed-source strategies that limit broader access and collaboration.
What makes DeepSeek particularly disruptive is its ability to achieve cutting-edge performance while reducing computing costs – an area where US firms have struggled due to their dependence on training models that demand very expensive processing hardware.
Where once Silicon Valley was the epicentre of global digital innovation, its corporate behemoths now appear vulnerable to more innovative, “scrappy” startup competitors – albeit ones enabled by major state investment in AI infrastructure. By leveraging China’s industrial approach to AI, DeepSeek has crystallised a reality that many in Silicon Valley have long ignored: AI’s centre of power is shifting away from the US and the west.
It highlights the failure of US attempts to preserve its technological hegemony through tight export controls on cutting-edge AI chips to China. According to research fellow Dean Ball: “You can keep [computing resources] away from China, but you can’t export-control the ideas that everyone in the world is hunting for.”
DeepSeek’s success has forced Silicon Valley and large western tech companies to “take stock”, realising that their once-unquestioned dominance is suddenly at risk. Even the US president, Donald Trump, has proclaimed that this should be a “wake-up call for our industries that we need to be laser-focused on competing”.
But this story is not just about technological prowess – it could mark an important shift in global power. Former US secretary of state Mike Pompeo has framed DeepSeek’s emergence as a “shot across America’s bow”, urging US policymakers and tech executives to take immediate action.
DeepSeek’s rapid rise underscores a growing realisation: globally, we are entering a potentially new AI paradigm, one where China’s model of open-source innovation and state-backed development is proving more effective than Silicon Valley’s corporate-driven approach.
The Insights section is committed to high-quality longform journalism. Our editors work with academics from many different backgrounds who are tackling a wide range of societal and scientific challenges.
I’ve spent much of my career analysing the transformative role of AI on the global digital landscape – examining how AI shapes governance, market structures and public discourse, and exploring its geopolitical and ethical dimensions, now and far in the future.
I also have personal connections with China, having lived there while teaching at Jiangsu University, then written my PhD thesis on the country’s state-led marketisation programme. Over the years, I have studied China’s evolving tech landscape, observing firsthand how its unique blend of state-driven industrial policy and private-sector innovation has fuelled rapid AI development.
I believe this moment may come to be seen as a turning point not just for AI, but for the geopolitical order. If China’s AI dominance continues, what could this mean for the future of digital governance, democracy, and the global balance of power?
China’s open-source AI takeover
Even in the early days of China’s digital transformation, analysts predicted the country’s open-source focus could lead to a major AI breakthrough. In 2018, China was integrating open-source collaboration into its broader digitisation strategy, recognising that fostering shared development efforts could accelerate its AI capabilities.
Unlike the US, where proprietary AI models dominated, China embraced open-source ecosystems to bypass western gatekeeping, scale innovation faster, and embed itself in global AI collaboration. China’s open-source activity surged dramatically in 2020, laying the foundation for the kind of innovation seen today. By actively fostering an open-source culture, China ensured that a broad range of developers had access to AI tools, rather than restricting them to a handful of dominant companies.
The trend has continued in recent years, with China even launching its own state-backed open-source operating systems and platforms in 2023, to further reduce its dependence on western technology. This move was widely seen as an effort to cement its AI leadership and create an independent, self-sustaining digital ecosystem.
Video: BBC.
While China has been steadily positioning itself as a leader in open-source AI, Silicon Valley firms remained focused on closed, proprietary models – allowing China to catch up fast. While companies like Google and Meta promoted open-source initiatives in name, they still locked key AI capabilities behind paywalls and restrictive licenses.
In contrast, China’s government-backed initiatives have treated open-source AI as a national resource, rather than a corporate asset. This has resulted in China becoming one of the world’s largest contributors to open-source AI development, surpassing many western firms in collaborative projects. Chinese tech giants such as Huawei, Alibaba and Tencent are driving open-source AI forward with frameworks like PaddlePaddle, X-Deep Learning (X-DL) and MindSpore — all now core to China’s machine learning ecosystem.
But they’re also making major contributions to global AI projects, from Alibaba’s Dragonfly, which streamlines large-scale data distribution, to Baidu’s Apollo, an open-source platform accelerating autonomous vehicle development. These efforts don’t just strengthen China’s AI industry, they embed it deeper into the global AI landscape.
This shift had been years in the making, as Chinese firms (with state backing) pushed open-source AI forward and made their models publicly available, creating a feedback loop that western companies have also – quietly – tapped into. A year ago, for example, US firm Abacus.AI released Smaug-72B, an AI model designed for enterprises that was built directly on Alibaba’s Qwen-72B and outperformed proprietary models like OpenAI’s GPT-3.5 and Mistral’s Medium. But the potential for US companies to further build on Chinese open-source technology may be limited by political as well as corporate barriers.
In 2023, US lawmakers highlighted growing concerns that China’s aggressive investment in open-source AI and semiconductor technologies would eventually erode western leadership in AI. Some policymakers called for bans on certain open-source chip technologies, due to fears they could further accelerate China’s AI advancements.
But by then, China’s AI horse had already bolted.
AI with Chinese characteristics
DeepSeek’s rise should have been obvious to anyone familiar with management theory and the history of technological breakthroughs linked to “disruptive innovation”. Latecomers to an industry rarely compete by playing the same game as incumbents – they have to be disruptive.
China, facing restrictions on cutting-edge western AI chips and lagging behind in proprietary AI infrastructure, had no choice but to innovate differently. Open-source AI provided the perfect vehicle: a way to scale innovation rapidly, lower costs and tap into global research while bypassing Silicon Valley’s resource-heavy, closed-source model.
From a western and traditional human rights perspective, China’s embrace of open-source AI may appear paradoxical, given the country’s strict information controls. Its AI development strategy prioritises both technological advancement and strict alignment with the Chinese Communist party’s ideological framework, ensuring AI models adhere to “core socialist values” and state-approved narratives. AI research in China has thrived not only despite these constraints but, in many ways, because of them.
Video: CNBC.
China’s success goes beyond traditional authoritarianism; it embodies what Harvard economist David Yang calls “Autocracy 2.0”. Rather than relying solely on fear-based control, it uses economic incentives, bureaucratic efficiency, and technology to manage information and maintain regime stability.
The Chinese government has strategically encouraged open-source development while maintaining tight control over AI’s domestic applications, particularly in surveillance and censorship. Indeed, authoritarian regimes may have a significant advantage in developing facial-recognition technology due to their extensive surveillance systems. The vast amounts of data collected through these networks enable private AI companies to create advanced algorithms, which can then be adapted for commercial uses, potentially accelerating economic growth.
China’s AI strategy is built on a dual foundation of state-led initiatives and private-sector innovation. The country’s AI roadmap, first outlined in the 2017 new generation artificial intelligence development plan, follows a three-phase timeline: achieving global competitiveness by 2020, making major AI breakthroughs by 2025, and securing world leadership in AI by 2030. In parallel, the government has emphasised data governance, regulatory frameworks and ethical oversight to guide AI development “responsibly”.
A defining feature of China’s AI expansion has been the massive infusion of state-backed investment. Over the past decade, government venture capital funds have injected approximately US$912 billion (£737bn) into early-stage firms, with 23% of that funding directed toward AI-related companies. A significant portion has targeted China’s less-developed regions, following local investment mandates.
Compared with private venture capital, government-backed firms often lag in software development but demonstrate rapid growth post-investment. Moreover, state funding often serves as a signal for subsequent private-sector investment, reinforcing the country’s AI ecosystem.
China’s AI strategy represents a departure from its traditional industrial policies, which historically emphasised self-sufficiency, support for a handful of national champions, and military-driven research. Instead, the government has embraced a more flexible and collaborative approach that encourages open-source software adoption, a diverse network of AI firms, and public-private partnerships to accelerate innovation. This model prioritises research funding, state-backed AI laboratories, and AI integration across key industries including security, healthcare, and infrastructure.
Despite strong state involvement, China’s AI boom is equally driven by private-sector innovation. The country is home to an estimated 4,500 AI companies, accounting for 15% of the world’s total.
As economist Liu Gang told the Chinese Communist Party’s Global Times newspaper: “The development of AI is fast in China – for example, for AI-empowered large language models. Aided with government spending, private capital is flowing to the new sector. Increased capital inflow is anticipated to further enhance the sector in 2025.”
China’s tech giants including Baidu, Alibaba, Tencent and SenseTime have all benefited from substantial government support while remaining competitive on the global stage. But unlike in the US, China’s AI ecosystem thrives on a complex interplay between state support, corporate investment and academic collaboration.
Recognising the potential of open-source AI early on, Tsinghua University in Beijing has emerged as a key innovation hub, producing leading AI startups such as Zhipu AI, Baichuan AI, Moonshot AI and MiniMax — all founded by its faculty and alumni. The Chinese Academy of Sciences has similarly played a crucial role in advancing research in deep learning and natural language processing.
Unlike the west, where companies like Google and Meta promote open-source models for strategic business gains, China sees them as a means of national technological self-sufficiency. To this end, the National AI Team, composed of 23 leading private enterprises, has developed the National AI Open Innovation Platform, which provides open access to AI datasets, toolkits, libraries and other computing resources.
DeepSeek is a prime example of China’s AI strategy in action. The company’s rise embodies the government’s push for open-source collaboration while remaining deeply embedded within a state-guided AI ecosystem. Chinese developers have long been major contributors to open-source platforms, ranking as the second-largest group on GitHub by 2021.
Founded by Chinese entrepreneur Liang Wenfeng in 2023, DeepSeek has positioned itself as an AI leader while benefiting from China’s state-driven AI ecosystem. Liang, who also established the hedge fund High-Flyer, has maintained full ownership of DeepSeek and avoided external venture capital funding.
Though there is no direct evidence of government financial backing, DeepSeek has reaped the rewards of China’s AI talent pipeline, state-sponsored education programs, and research funding. Liang has engaged with top government officials including China’s premier, Li Qiang, reflecting the company’s strategic importance to the country’s broader AI ambitions.
In this way, DeepSeek perfectly encapsulates “AI with Chinese characteristics” – a fusion of state guidance, private-sector ingenuity, and open-source collaboration, all carefully managed to serve the country’s long-term technological and geopolitical objectives.
Recognising the strategic value of open-source innovation, the government has actively promoted domestic open-source code platforms like Gitee to foster self-reliance and insulate China’s AI ecosystem from external disruptions. However, this also exposes the limits of China’s open-source ambitions. The government pushes collaboration, but only within a tightly controlled system where state-backed firms and tech giants call the shots.
Reports of censorship on Gitee reveal how Beijing carefully manages innovation, ensuring AI advances stay in line with national priorities. Independent developers can contribute, but the real power remains concentrated in companies that operate within the government’s strategic framework.
The conflicted reactions of US big tech
DeepSeek’s emergence has sparked intense debate across the AI industry, drawing a range of reactions from leading Silicon Valley executives, policymakers and researchers. While some view it as an expected evolution of open-source AI, others see it as a direct challenge to western AI leadership.
Microsoft’s CEO, Satya Nadella, emphasised its technical efficiency. “It’s super-impressive in terms of both how they have really effectively done an open-source model that does this inference-time compute, and is super-compute efficient,” Nadella told CNBC. “We should take the developments out of China very, very seriously”.
Silicon Valley venture capitalist Marc Andreessen, a prominent advisor to Trump, was similarly effusive. “DeepSeek R1 is one of the most amazing and impressive breakthroughs I’ve ever seen – and as open source, a profound gift to the world,” he wrote on X.
For Yann LeCun, Meta’s chief AI scientist, DeepSeek is less about China’s AI capabilities and more about the broader power of open-source innovation. He argued that the situation should be read not as China’s AI surpassing the US, but rather as open-source models surpassing proprietary ones. “DeepSeek has profited from open research and open source (e.g. PyTorch and Llama from Meta),” he wrote on Threads. “They came up with new ideas and built them on top of other people’s work. Because their work is published and open source, everyone can profit from it. That is the power of open research and open source.”
Not all responses were so measured. Alexandr Wang, CEO of Scale AI – a US firm specialising in AI data labelling and model training – framed DeepSeek as a competitive threat that demands an aggressive response. He wrote on X: “DeepSeek is a wake-up call for America, but it doesn’t change the strategy: USA must out-innovate & race faster, as we have done in the entire history of AI. Tighten export controls on chips so that we can maintain future leads. Every major breakthrough in AI has been American.”
Elon Musk added fuel to speculation about DeepSeek’s hardware access when he responded with a simple “obviously” to Wang’s earlier claims on CNBC that DeepSeek had secretly acquired 50,000 Nvidia H100 GPUs, despite US export restrictions.
Beyond the tech world, US policymakers have taken a more adversarial stance. House speaker Mike Johnson accused China of leveraging DeepSeek to erode American AI leadership. “They abuse the system, they steal our intellectual property. They’re now trying to get a leg up on us in AI.”
For his part, Trump took a more pragmatic view, seeing DeepSeek’s efficiency as a validation of cost-cutting approaches. “I view that as a positive, as an asset … You won’t be spending as much, and you’ll get the same result, hopefully.”
The rise of DeepSeek may have helped jolt the Trump administration into action, leading to sweeping policy shifts aimed at securing US dominance in AI. In his first week back in the White House, the US president announced a series of aggressive measures, including massive federal investments in AI research, closer partnerships between the government and private tech firms, and the rollback of regulations seen as slowing US innovation.
The administration’s framing of AI as a critical national interest reflects a broader urgency sparked by China’s rapid advancements, particularly DeepSeek’s ability to produce cutting-edge models at a fraction of the cost traditionally associated with AI development. But this response is not just about national competitiveness – it is also deeply entangled with private industry.
Musk’s growing closeness to Trump, for example, can be viewed as a calculated move to protect his own dominance at home and abroad. By aligning with the administration, Musk ensures that US policy tilts in favour of his AI ventures, securing access to government backing, computing power, and regulatory control over AI exports.
At the same time, Musk’s public criticism of Trump’s US$500 billion AI infrastructure plan – claiming the companies involved lack the necessary funding – was as much a warning as a dismissal, signalling his intent to shape policy in a way that benefits his empire while keeping potential challengers at bay.
Not unrelated, Musk and a group of investors have just launched a US$97.4 billion (£78.7bn) bid for OpenAI’s nonprofit arm, a move that escalates his feud with OpenAI CEO Sam Altman and seeks to strengthen his grip on the AI industry. Altman has dismissed the bid as a “desperate power grab”, insisting that OpenAI will not be swayed by Musk’s attempts to reclaim control. The spat reflects how DeepSeek’s emergence has thrown US tech giants into what could be all-out war, fuelling bitter corporate rivalries and reshaping the fight for AI dominance.
And while the US and China escalate their AI competition, other global leaders are pushing for a coordinated response. The Paris AI Action Summit, held on February 10 and 11, has become a focal point for efforts to prevent AI from descending into an uncontrolled power struggle. France’s president, Emmanuel Macron, warned delegates that without international oversight, AI risks becoming “the wild west”, where unchecked technological development creates instability rather than progress.
But at the end of the two-day summit, the UK and US refused to sign an international commitment to “ensuring AI is open, inclusive, transparent, ethical, safe, secure and trustworthy … making AI sustainable for people and the planet”. China was among the 61 countries to sign this declaration.
Concerns have also been raised at the summit about how AI-powered surveillance and control are enabling authoritarian regimes to strengthen repression and reshape the citizen-state relationship. This highlights the fast-growing global industry of digital repression, driven by an emerging “authoritarian-financial complex” that may exacerbate China’s strategic advancement in AI.
Equally, DeepSeek’s cost-effective AI solutions have created an opening for European firms to challenge the traditional AI hierarchy. As AI development shifts from being solely about compute power to strategic efficiency and accessibility, European firms now have an opportunity to compete more aggressively against their US and Chinese counterparts.
Whether this marks a true rebalancing of the AI landscape remains to be seen. But DeepSeek’s emergence has certainly upended traditional assumptions about who will lead the next wave of AI innovation – and how global powers will respond to it.
End of the ‘Silicon Valley effect’?
DeepSeek’s emergence has forced US tech leaders to confront an uncomfortable reality: they underestimated China’s AI capabilities. Confident in their perceived lead, companies like Google, Meta, and OpenAI prioritised incremental improvements over anticipating disruptive competition, leaving them vulnerable to a rapidly evolving global AI landscape.
In response, the US tech giants are now scrambling to defend their dominance, pledging over US$400 billion in AI investment. DeepSeek’s rise, fuelled by open-source collaboration, has reignited fierce debates over innovation versus security, while its energy-efficient model has intensified scrutiny on AI’s sustainability.
Yet Silicon Valley continues to cling to what many view as outdated economic theories such as the Jevons paradox to downplay China’s AI surge, insisting that greater efficiency will only fuel demand for computing power and reinforce their dominance. Companies like Meta, OpenAI and Microsoft remain fixated on scaling computational power, betting that expensive hardware will secure their lead. But this assumption blinds them to a shifting reality.
DeepSeek’s rise as the potential “Walmart of AI” is shaking Silicon Valley’s foundation, proving that high-quality AI models can be built at a fraction of the cost. By prioritising efficiency over brute-force computing power, DeepSeek is challenging the US tech industry’s reliance on expensive hardware like Nvidia’s high-end chips.
This shift has already rattled markets, driving down the stock prices of major US firms and forcing a reassessment of AI dominance. Nvidia, whose business depends on supplying high-performance processors, appears particularly vulnerable as DeepSeek’s cost-effective approach threatens to reduce demand for premium chips.
Video: CBS News.
The growing divide between the US and China in AI, however, is more than just competition – it’s a clash of governance models. While US firms remain fixated on protecting market dominance, China is accelerating AI innovation with a model that is proving more adaptable to global competition.
If Silicon Valley resists structural change, it risks falling further behind. We may witness the unravelling of the “Silicon Valley effect”, through which tech giants have long manipulated AI regulations to entrench their dominance. For years, Google, Meta and OpenAI shaped policies that favoured proprietary models and costly infrastructure, ensuring AI development remained under their control.
More than a policy-driven rise, China’s AI surge reflects a fundamentally different innovation model – fast, collaborative and market-driven – while Silicon Valley holds on to expensive infrastructure and rigid proprietary control. If US firms refuse to adapt, they risk losing the future of AI to a more agile and cost-efficient competitor.
A new era of geotechnopolitics
But China is not just disrupting Silicon Valley. It is expanding “geotechnopolitics”, where AI is a battleground for global power. With AI projected to add US$15.7 trillion to the global economy by 2030, China and the US are racing to control the technology that will define economic, military and political dominance.
DeepSeek’s advancement has raised national security concerns in the US. Trump’s government is considering stricter export controls on AI-related technologies to prevent them from bolstering China’s military and intelligence capabilities.
As AI-driven defence systems, intelligence operations and cyber warfare redefine national security, governments must confront a new reality: AI leadership is not just about technological superiority, but about who controls the intelligence that will shape the next era of global power.
China’s AI ambitions extend beyond technology, driving a broader strategy for economic and geopolitical dominance. But with over 50 state-backed companies developing large-scale AI models, its rapid expansion faces growing challenges, including soaring energy demands and US semiconductor restrictions.
China’s president, Xi Jinping, remains resolute, stating: “Whoever can grasp the opportunities of new economic development such as big data and artificial intelligence will have the pulse of our times.” He sees AI driving “new quality productivity” and modernising China’s manufacturing base, calling its “head goose effect” a catalyst for broader innovation.
To counter western containment, China has embraced a “guerrilla” economic strategy, bypassing restrictions through alternative trade networks, deepening ties with the global south, and exploiting weaknesses in global supply chains. Instead of direct confrontation, this decentralised approach uses economic coercion to weaken adversaries while securing China’s own industrial base.
China is also leveraging open-source AI as an ideological tool, presenting its model as more collaborative and accessible than western alternatives. This narrative strengthens its global influence, aligning with nations seeking alternatives to western digital control. While strict state oversight remains, China’s embrace of open-source AI reinforces its claim to a future where innovation is driven not by corporate interests but through shared collaboration and global cooperation.
But while DeepSeek claims to be open access, its secrecy tells a different story. Key details on training data and fine-tuning remain hidden, and its compliance with China’s AI laws has sparked global scrutiny. Italy has banned the platform over data-transfer risks, while Belgium and Ireland launched privacy probes.
Under Chinese regulations, DeepSeek’s outputs must align with state-approved narratives, clashing with the EU’s AI Act, which demands transparency and protects political speech. Such “controlled openness” raises many red flags, casting doubt on China’s place in markets that value data security and free expression.
Many western commentators are seizing on reports of Chinese AI censorship to frame other models as freer and more politically open. The revelation that a leading Chinese chatbot actively modifies or censors responses in real time has fuelled a broader narrative that western AI operates without such restrictions, reinforcing the idea that democratic systems produce more transparent and unbiased technology. This framing serves to bolster the argument that free societies will ultimately lead the global AI race.
But at its heart, the “AI arms race” is driven by technological dominance. The US, China, and the EU are charting different paths, weighing security risks against the need for global collaboration. How this competition is framed will shape policy: lock AI behind restrictions, or push for open innovation.
DeepSeek, for all its transformational qualities, continues to exemplify a model of AI where innovation prioritises scale, speed and efficiency over societal impact. This drive to optimise computation and expand capabilities overshadows the need to design AI as a truly public good. In doing so, it eclipses this technology’s genuine potential to transform governance, public services and social institutions in ways that prioritise collective wellbeing, equity and sustainability over corporate and state control.
A truly global AI framework requires more than political or technological openness. It demands structured cooperation that prioritises shared governance, equitable access, and responsible development. Following a workshop in Shanghai hosted by the Chinese government last September, the UN’s general secretary, António Guterres, outlined his vision for AI beyond corporate or state control: “We must seize this historic opportunity to lay the foundations for inclusive governance of AI – for the benefit of all humanity. As we build AI capacity, we must also develop shared knowledge and digital public goods.”
Both the west and China frame their AI ambitions through competing notions of “openness” – each aligning with their strategic interests and reinforcing existing power structures.
Western tech giants claim AI drives democratisation, yet they often dominate digital infrastructure in parts of Africa, Asia and Latin America, exporting models based on “corporate imperialism” that extract value while disregarding local needs. China, by contrast, positions itself as a technological partner for the rest of the global south; however, its AI remains tightly controlled, reinforcing state ideology.
China’s proclaimed view on international AI collaboration emphasises that AI should not be “a game of rich countries”, as President Xi stated during the 2024 G20 summit. By advocating for inclusive global AI development, China positions itself as a leader in shaping international AI governance, especially via initiatives like the UN AI resolution and its AI capacity-building action plan. These efforts help promote a more balanced technological landscape while allowing China to strengthen its influence in global AI standards and frameworks.
However, beneath all these narratives, both China and the US share a strategy of AI expansion that relies on exploited human labour, from data annotation to moderation, exposing a system driven less by innovation than by economic and political control.
Seeing AI as a connected race for influence highlights the need for ethical deployment, cross-border cooperation, and a balance between security and progress. And this is where China may face its greatest challenge – balancing the power of open-source innovation with the constraints of a tightly controlled, authoritarian system that thrives on restriction, rather than openness.
Peter Bloom does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Source: The Conversation – UK – By Ivo Vlaev, Professor of Behavioural Science, Warwick Business School, University of Warwick
Sir Keir Starmer has become the first sitting UK prime minister to publicly take an HIV test to reduce stigma around Aids and encourage more people to get tested.
There are historical parallels. In 1956, Elvis Presley, at the height of his fame, was filmed receiving his polio vaccine on US television.
Do these high-profile gestures really change attitudes and behaviour, or are they just headline-grabbing stunts?
A closer look at the behavioural science behind celebrity endorsements suggests that, under the right conditions, public demonstrations by famous figures can indeed shift social norms, reduce stigma and influence health outcomes. However, the effects depend a lot on the credibility of the endorser, the authenticity of the act and the presence of sustained, follow-up campaigns.
Elvis Presley’s polio jab is one of the most iconic examples of celebrity-led health campaigns. But many other well-known figures have encouraged the public to adopt protective health measures, from actors promoting annual flu jabs to footballers advocating organ donation drives.
The premise is that a celebrity’s endorsement can normalise certain behaviour by tapping into the principles of “social learning theory”, particularly observational learning. That is, when we see someone we admire or trust do something, we are more likely to follow suit.
In the 1950s, polio was a serious threat, capable of causing paralysis or death. After witnessing Elvis roll up his sleeve on national television, many teenagers – previously sceptical or apathetic – became far more willing to accept the polio vaccine. That event is now hailed as a masterclass in leveraging popular culture to address a public health crisis.
A cornerstone of behavioural science is the recognition that who delivers a message can be as important as – or sometimes more important than – what the message contains. The so-called “messenger effect” highlights how we are often more persuaded by people we perceive to be credible, relatable or high status.
In the case of Elvis, he was already idolised by millions. He was the perfect conduit to promote vaccination among teenagers who might otherwise dismiss appeals from older authority figures.
Starmer occupies a different position of influence. Supporters of the Labour party may see him as a trustworthy figure, while others could be sceptical of a politician’s motives. This underscores a key aspect of the messenger effect: if a large segment of the target audience views the figure as partisan or self-serving, the endorsement can backfire or simply fail to register.
Another powerful force identified in behavioural science is social norms – our shared understandings of what is typical or appropriate – which strongly influence whether we take certain actions.
Stigma around HIV remains a major barrier to testing and treatment. Even though medical advances have changed the landscape of HIV/Aids care, many people still fear the societal consequences of a positive diagnosis. According to the UK Health Security Agency, around 5,000 people in the UK are unaware they are living with HIV, partly because they hesitate to test in the first place.
By publicly taking an HIV test, Starmer aimed to shift perceptions and normalise testing. In terms of social identity theory, seeing a prominent figure within the national community – especially one involved in shaping policies – undergo testing can communicate that “people like us” view HIV testing as a routine, responsible health measure. This may be particularly powerful for people who identify politically with Starmer or who respect his leadership position.
Despite the potential of celebrity or high-profile endorsements, behavioural science also points to authenticity as a vital ingredient. Audiences are more likely to change their behaviour if they believe the celebrity genuinely cares about the issue rather than simply seeking publicity. If endorsements are perceived as insincere or politically opportunistic, their effect can be muted or even counterproductive.
In Elvis’s case, he was known for engaging with young fans and had a track record of public good works, which helped bolster the sense that his polio vaccination was done for more than just a publicity boost.
For Starmer, sustaining the momentum beyond a single test – through continued advocacy, support of free testing programmes, and visibility in HIV-awareness campaigns – could reinforce the perception of a real commitment rather than a fleeting photo opportunity.
Nudges
Behavioural scientists also often talk about “nudges” – small interventions that change people’s choices without forbidding options or significantly changing incentives. A celebrity endorsement can serve as a nudge by making a desirable health behaviour (like getting tested) more top-of-mind or socially acceptable.
However, historically, Elvis’s vaccination was not a standalone act. It was part of a broader public health strategy involving schools, local campaigns and continued outreach. Those elements ensured that once people were motivated to get the polio jab, they could do so easily.
For HIV testing, the same principle applies: visible leadership from Starmer may spark initial interest, but practical measures – such as pop-up testing centres, free home-test kits and confidential testing support – are vital to maintain engagement.
Is Keir Starmer the new Elvis? In reality, the two scenarios differ in time and context. A 21st-century political leader raising awareness about HIV testing in the UK operates within a more complex media landscape than a 1950s rock ’n’ roll icon on American primetime television. Yet, there is a parallel: both used their public status to tackle a widespread health concern, hoping to overcome stigma and promote an important preventative measure.
Ultimately, celebrity moments can open the door, but only a sustained, evidence-based strategy will keep it open – and encourage people to walk through.
Anyone in England can order a free and confidential HIV test from www.freetesting.hiv to do the test at home.
Ivo Vlaev does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Grown in Ecuador, sold in Paris. Robert Crum / Shutterstock
As you read this, planes full of roses are heading from east Africa and South America to almost every corner of the world. If you buy someone a rose this Valentine’s Day, it may be in the air right now or perhaps in a refrigerated warehouse in the Netherlands.
A huge logistical operation ensures those flowers are timed to be perfectly in bloom on the 14th. From flower farm to bouquet can take just a few days. In all, hundreds of millions of roses will be shipped internationally this week, and many will die before they can be sold.
Can all this flying be justified?
You’re reading the Imagine newsletter – a weekly synthesis of academic insight on solutions to climate change, brought to you by The Conversation. I’m Will de Freitas, energy and environment editor, covering for my colleague Jack Marley who is lovesick. This week, we’re looking at flowers.
Many people don’t realise just how far a Valentine’s rose has probably travelled. Though roses can be grown in the UK (and some species are native), most of them won’t flower for at least another few months.
Jill Timms and David Bek, academics at Coventry University who have researched the global flower trade, point out: “This sort of localised growing does not satisfy the demand for volume, variety and year-round supply, or indeed guarantee sustainability in terms of energy, pesticide use and so on.”
This means most roses are imported from countries with more land, more sunshine, and a cheaper workforce. Major growers include Colombia, Ecuador, Kenya and Ethiopia. The Netherlands is actually the biggest exporter of roses, partly due to its own production in greenhouses but mostly thanks to its position as a crucial hub for the global trade. Flowers sent to the UK from the Netherlands were probably grown elsewhere.
To ensure they stay fresh, those flowers are kept cool as they’re transported in a series of refrigerated lorries, planes or boats, while some are sprayed with chemicals to freeze them.
“Geography matters,” say Timms and Bek. “Some flowers travel by sea, some cargo plane and others in the hold of passenger jets, all with very different carbon footprints.”
Figuring out a flower’s carbon footprint is not straightforward. Jennifer Lavers and Fiona Kerslake from the University of Tasmania compared cut flowers grown in heated or refrigerated greenhouses in the Netherlands with those grown in Kenya.
“Maintaining the controlled environmental conditions inside these [Dutch] buildings requires artificial light, heat and cooling, so each rose grown in The Netherlands contributes an average of around 2.91kg of CO₂ to the atmosphere.”
“In contrast”, they write, “a single rose grown on a farm in Kenya contributes only 0.5kg. This is largely because Kenyan hot houses do not use artificial heating or lighting, and most farm workers walk or cycle to work. As a result, flowers grown in tropical regions are sometimes considered low-carbon (of course, this doesn’t always factor in international transport).”
Paul D. Larson of the University of Manitoba points out that, while local production would ground some of the international flower flights, “growing flowers in greenhouses can use as much energy as shipping them [to North America] from Colombia by air freight”.
Larson, a professor of supply chain management, does highlight one major issue with “low carbon” flowers in the global south, however:
“Since flowers are not classified as edible, they are often exempt from pesticide regulations. Thus, many flower production workers in Ecuador and Colombia have suffered from respiratory problems, rashes and eye infections caused by exposure to toxic chemicals in fertilizers, fungicides and pesticides.”
The flower trade in Ecuador and Colombia was actually engineered a few decades ago to try and stem the flow of cocaine into the US, says Jay L. Zagorsky, an associate professor at Boston University’s business school.
“One part of the strategy was to convince farmers in Colombia to stop growing coca leaves – a traditional Andean plant that provides the raw ingredient for making cocaine – by giving them preferential access to US markets if they grew something else.”
Whether this policy helped stop drug production is unclear, says Zagorsky, but American domestic rose growing has collapsed and “many businesses in Colombia and Ecuador started growing and shipping flowers north”.
No one expects you to know exactly how a flower was grown, what conditions were like for workers, or to conduct a full “life cycle assessment” of their carbon footprint. But what can you do to help this Valentine’s Day?
Timms and Bek, the flower trade experts at Coventry University, wrote about five ways to ensure your flowers are ethical. They contrast flowers grown in the Netherlands and Kenya and say that “your priorities need to guide your purchase: environmental issues include carbon footprint, chemical use, ecological degradation and water use; social issues include health and safety standards, gender discrimination, precarious employment and land rights.”
Source: United States Senator for North Dakota John Hoeven
02.12.25
WASHINGTON – Senator John Hoeven issued the following statement after President Donald Trump nominated North Dakota’s State Superintendent of Public Instruction Kirsten Baesler to serve as the Assistant Secretary for Elementary and Secondary Education (OESE) at the U.S. Department of Education (ED). Hoeven worked with incoming Education Secretary Linda McMahon during her time as the Administrator of the Small Business Administration (SBA) and recommended to McMahon both in person and over the phone that Baesler be nominated to this position.
“We appreciate President Trump and Department of Education Secretary-elect McMahon nominating Kirsten to this position. Kirsten has done a tremendous job overseeing the education of students in North Dakota and will be a great asset to the Trump administration,” said Hoeven. “Kirsten has spent her career focused on education and has experience ranging from teaching in a classroom to leading the NDPPI. We congratulate her on her nomination and will work with our colleagues to ensure she is confirmed by the Senate as quickly as possible.”
Baesler has served as state school superintendent since January 2013, where she leads the 86-person team responsible for overseeing the education of both public and nonpublic school students in North Dakota. Prior to her election as superintendent, Baesler spent 24 years working in the Bismarck Public School System including as a vice principal, classroom teacher and library media specialist. She spent nine years on the Mandan School Board, serving as president of the board for seven years. Baesler is a native of Flasher and graduated from Bismarck State College, Minot State University and Valley City State University.
For the first time, the Federation of Sports Programming (FSP) presented its awards, recognising the most outstanding and talented athletes, coaches and partners.
The award ceremony took place as part of the FonCode 2024 sports programming tournament. The competition began with an online qualifying round, in which more than a thousand programmers participated. The 128 best competitors advanced to the final duels, competing for a prize fund of 1 million rubles.
The tournament culminated in a duel between Fedor Romashov and Ho Dang Dung. The outcome of the match remained unpredictable until the end — the main round ended in a draw. Only an additional task brought victory to Fedor Romashov, a third-year bachelor’s student in “Applied Mathematics and Computer Science”.
It is not surprising that Fedor also became the best sports programmer of 2024 according to the FSP.
“It was unexpected and pleasant to receive the FSP award,” noted Fedor Romashov. “I had quite a few cool victories last year. My team and I won ICPC 2023, which I consider my best and most important achievement in this area. I am also glad about my victory in the Games of the Future – it was an incredible experience.”
The “Coach of the Year” award was received by Mikhail Gustokashin, Director of the Center for Student Olympiads at the Faculty of Computer Science. In 2024, under Mikhail’s leadership, one of the teams from the Faculty of Computer Science became world champions at the International Student Programming Contest (ICPC), and the second team received a gold medal for 3rd place.
“The Federation of Sports Programming holds many competitions with interesting tasks and experimental formats. Our students take part in these competitions with pleasure and achieve great success. Recognition from the FSP is very pleasant, we will continue to work!” – concluded Mikhail Gustokashin.
Text: Alexandra Sytnik
While the first world war and the Spanish civil war had already drawn children in Europe and beyond into the orbit of conflict, the second world war marked a pivotal period in how young people have experienced the horrors of war.
During the 1940s, children faced unprecedented mobilisation and violence. From bombings and massacres to forced displacement and genocide, the impact was staggering. Millions of children were directly affected by these atrocities, while countless others endured the indirect consequences: shortages, family separations and grief.
In the aftermath of the war, childhood experts such as pediatricians, psychologists and nutritionists, as well as political leaders and humanitarian workers, feared for this potentially “lost generation”. With recognition of the vulnerability of children as a social group, there was a transnational push to implement protective measures. This shared awareness led to milestones such as the establishment of the United Nations International Children’s Emergency Fund (UNICEF) in December 1946 and, later, the adoption of the Declaration of the Rights of the Child.
The period from 1939 to 1949 not only highlighted the need to protect children worldwide, but also underscored their importance in building a peaceful future. As detailed in La Seconde Guerre mondiale des enfants (The second world war of children), published in September 2024 by Presses Universitaires de France, children embodied hope for postwar nations. They were seen not only as victims of war but also as active participants in shaping a peaceful world.
Schools as foundations of reconstruction
After 1945, schools became central to Europe’s social reconstruction. Seen as spaces of socialisation that included nearly all children, schools were viewed as critical for rebuilding society. Some measures mirrored those introduced after the first world war. Children, particularly those aged 6 to 14 (the typical age for compulsory education in Europe), were tasked with preserving the memory of fallen soldiers, resistance fighters and civilian victims. They cleaned and adorned graves, attended public ceremonies and paid homage to the dead.
However, postwar education went further. In some countries, particularly those that formerly had authoritarian or totalitarian regimes such as Italy and Germany, school curricula underwent significant transformation. Lessons on democratic governance and peaceful figures were either reinforced or reintroduced, and history classes began emphasising cultural, political and economic exchanges between nations. These reforms aimed to counteract the nationalist ideologies that had fuelled war and division.
Unlike the post-WWI era, the years after 1945 saw efforts to strengthen ties between nations by fostering connections among their youngest citizens. Programs promoting international school exchanges flourished. French students corresponded with Canadian peers, British children sent books to Germans and Swedish students traveled to Belgium.
Germany hosted one of the most ambitious programs: the US-led “World Friendship Among Children Program”. This initiative included pen-pal projects, student travel and even the symbolic adoption of war orphans by classrooms. The program also established the “World Friendship Council of the Future”, where young people proposed initiatives for international dialogue, mimicking the operations of newly formed organisations such as the United Nations, UNESCO (United Nations Educational, Scientific and Cultural Organization) and the World Health Organization.
It was also in Germany that Houses of America, or Youth Centres, were established. While the goal was to offer children sports and cultural activities, they were primarily seen by Americans as tools of soft power and political instruments to (re)educate youth about the principles of democracy.
Active pedagogy for European education
Indeed, after 1945, educating children for peace also meant educating them about democracy. Across Western Europe, teaching methods inspired by progressive education movements – championed by figures such as Maria Montessori, Ovide Decroly and John Dewey – became widespread.
For educational leaders, merely teaching democratic principles wasn’t enough: children needed to practice them. Classrooms became miniature societies where students elected class representatives, voted on school matters and debated everyday and political issues. This active engagement aimed to cultivate civic responsibility and critical thinking.
Some postwar experiments went further. Communities of children, or “children’s republics”, emerged across Europe to take in children who had lost their homes and parents. While their primary mission was humanitarian, these communities were also intended to form the foundations of new, peaceful societies. Self-governance was central to their goal of preparing children for active citizenship. In the Repubblica dei Ragazzi (boys’ republic) in Santa Marinella, near Rome, children ran their own court, deliberative assembly and union.
Ideological differences
While schools were indeed the cornerstone of peacebuilding, debates about fostering peace went beyond the classroom to encompass all aspects of children’s lives. This included the private sphere, as evidenced by numerous transnational legislative efforts to ban violent comic books and war-themed toys, which were accused of inciting aggression in children and thus threatening a peaceful future.
This surge of post-WWII initiatives underscores the fact that educating for peace and democracy was a European – if not global – project. However, its interpretation varied depending on country and region. In France, West Germany and Italy, the project was rooted in liberal ideals; in Eastern Europe, it reflected a different understanding of democracy.
In the West, the focus was on the individual, with boys and girls assigned traditional, gendered roles: girls were encouraged to become future mothers, while boys were groomed to be workers contributing to economic growth. In contrast, the Eastern model emphasised collective values within a socialist framework, promoting more egalitarian relationships between boys and girls, albeit in service of political objectives.
Regardless of ideological differences, these post-1945 initiatives left a lasting legacy. Their influence can still be seen today in school activities such as student elections and class trips, which continue to echo the democratic ideals of that era.
Camille Mahé ne travaille pas, ne conseille pas, ne possède pas de parts, ne reçoit pas de fonds d’une organisation qui pourrait tirer profit de cet article, et n’a déclaré aucune autre affiliation que son organisme de recherche.
The Government of Saskatchewan is taking further action to support growing student enrolment by investing an additional $29.5 million in relocatable classrooms. This mid-year funding increase brings the 2024-25 total investment in relocatables to $58 million, providing 76 new relocatables to help alleviate space pressures in schools across the province.
“With Saskatchewan’s growing population, we recognize the need for additional classroom space to support students and educators,” Education Minister Everett Hindley said. “This additional investment will ensure schools that anticipate capacity challenges in 2025-26 have the necessary infrastructure to accommodate students.”
The majority of the relocatables will be allocated to the fastest growing cities of Regina and Saskatoon. In addition, the communities of Clavet, Corman Park, Humboldt, Lloydminster, Martensville, Pilot Butte, Warman, and Weyburn will receive relocatables to alleviate space pressures.
School divisions will proceed with procurement and target installation prior to September 2025.
Source: The Conversation – Canada – By Jennifer Clapp, Professor and Canada Research Chair in Global Food Security and Sustainability, and Member of the International Panel of Experts on Sustainable Food Systems, University of Waterloo
History has shown us again and again that, so long as inequality goes unchecked, no amount of technology can ensure people are well fed.
Today, the world produces more food per person than ever before. Yet hunger and malnutrition persist in every corner of the globe — even, and increasingly, in some of its wealthiest countries.
The major drivers of food insecurity are well known: conflict, poverty, inequality, economic shocks and escalating climate change. In other words, the causes of hunger are fundamentally political and economic.
The urgency of the hunger crisis has prompted 150 Nobel and World Food Prize laureates to call for “moonshot” technological and agricultural innovations – monumental, lofty efforts – to boost food production. However, they largely ignored hunger’s root causes and the need to confront powerful entities and make courageous political choices.
Food is misallocated
To focus almost exclusively on promoting agricultural technologies to ramp up food production would be to repeat the mistakes of the past.
The Green Revolution of the 1960s-70s brought impressive advances in crop yields, though at considerable environmental cost. It failed to eliminate hunger, because it didn’t address inequality. Take Iowa, for example — home to some of the most industrialized food production on the planet. Amid its high-tech corn and soy farms, 11 per cent of the state’s population, and one in six of its children, struggle to access food.
Even though the world already produces more than enough food to feed everyone, it’s woefully misallocated. Selling food to poor people at affordable prices simply isn’t as profitable for giant food corporations.
They make far more by exporting it for animal feed, blending it into biofuels for cars or turning it into industrial products and ultra-processed foods. To make matters worse, a third of all food is simply wasted.
Meanwhile, as the laureates remind us, more than 700 million people — nine per cent of the world’s population — remain chronically undernourished. A staggering 2.3 billion people — more than one in four — cannot access an adequate diet.
Women queue up to receive food distributed by local volunteers at a camp in Somalia in May 2019. Conflicts hinder the effective delivery of humanitarian aid during food security crises. (AP Photo/Farah Abdi Warsameh)
Confronting inequity
Measures to address world hunger must start with its known causes and proven policies. Brazil’s Without Hunger program, for example, has seen a dramatic 85 per cent reduction in severe hunger in just 18 months through financial assistance, school food programs and minimum wage policies.
Our politicians must confront and reverse gross inequities in wealth, power and access to land. Hunger disproportionately affects the poorest and most marginalized people, not because food is scarce, but because people can’t afford it or lack the resources to produce it for themselves. Redistribution policies aren’t optional, they’re essential.
Governments must put a stop to the use of hunger as a weapon of war. The worst hunger hotspots are conflict zones, as seen in Gaza and Sudan, where violence drives famine. Too many governments have looked the other way on starvation tactics — providing emergency aid to pick up the pieces instead of taking action to end the conflicts driving hunger.
Governments must also break the stranglehold of inequitable trade rules and export patterns that trap the poorest regions in dependency on food imports, leaving them vulnerable to shocks.
Instead, supporting local and territorial markets is critical in helping build resilience to economic and supply chain disruptions. These markets provide livelihoods and help ensure diverse, nutritious foods reach those who need them.
Mitigating and adapting to climate change requires massive investments in transformative approaches that promote resilience and sustainability in food systems.
Agroecology — a farming system that applies ecological principles to ensure sustainability and promotes social equity in food systems — is a key solution, proven to sequester carbon, build resilience to climate shocks and reduce dependence on expensive and environmentally damaging synthetic fertilizers and pesticides.
More research should explore agroecology’s full potential. And we must adopt plant-rich, local and seasonal diets, ramp up measures to tackle food waste and reconsider using food crops for biofuels.
This means pushing back against Big Meat and biofuel lobbies, while investing in climate-resilient food systems.
Bold political action needed
This is not to say that technology has no role — all hands need to be on deck. But the innovations most worth pursuing are those that genuinely support more equitable and sustainable food systems, not corporate profits. Unless scientific efforts are matched by policies that confront power and prioritize equity over profit, hunger is likely to be here to stay.
The solutions to hunger are neither new nor beyond reach. What’s missing is the political will to address its root causes.
This message is shared by my colleagues with the International Panel of Experts on Sustainable Food Systems, IPES-Food, whose work covers a range of expertise and experience. Hunger persists because we allow injustice to endure. If we are serious about ending it, we need bold political action, not just scientific breakthroughs.
Jennifer Clapp receives funding from the Canada Research Chairs program and the Social Sciences and Humanities Research Council of Canada. She is a member of the International Panel of Experts on Sustainable Food Systems (IPES-Food).
At a recent summit in Dar Es Salaam, Tanzania, leaders of eight African states released a statement calling for an immediate and unconditional ceasefire in the Democratic Republic of Congo (DRC).
The statement comes after a flareup in fighting in eastern DRC that has killed hundreds and wounded thousands.
On Jan. 31, 2025, the rebel group known as the March 23 Movement (M23) captured the city of Goma in the eastern DRC. At a news conference, Corneille Nangaa, leader of the Congo River Alliance that includes M23, declared that they were there to stay and would march to the DRC’s capital of Kinshasa.
The World Health Organization reported 900 bodies had been recovered from the streets of Goma, with about 3,000 people injured and thousands forced to flee. The Congolese government said it had begun burying more than 2,000 people, and thousands more had been displaced.
On Feb. 4, 2025, the Congo River Alliance declared a ceasefire. This isn’t the first time M23 attacked Goma and then declared a ceasefire. The renewed violence is the latest in a long-running conflict in the region that has grown to involve local militias, regional countries and foreign companies seeking to exploit Congo’s mineral wealth.
In 2009, the Tutsi-led rebel group known as the National Congress for the Defence of the People (CNDP) signed a peace agreement with the Congolese government in Goma. However, a faction within the CNDP disapproved of the Goma agreement and created a militia group in 2012 that came to be known as M23, named after the March 23, 2009 accord. A United Nations group has said senior government officials from Rwanda and Uganda have provided M23 with weapons, intelligence and military support.
The roots of the conflict lie in the history of Belgium’s colonial rule of the region that pitted the Tutsi and Hutu ethnic groups against each other. In 1956, ethnic tensions in Rwanda forced many Tutsis to seek refuge in Congo (then Zaire), Uganda, Tanzania and beyond.
Tutsis who fled to Congo and Uganda were not accorded full citizenship rights, and this led to resentment.
In the mid-1990s, Rwandan President Paul Kagame and Ugandan President Yoweri Museveni collaborated with Congolese rebel leader Laurent-Désiré Kabila to create the Alliance of Democratic Forces for the Liberation of Congo-Zaire (AFDL). The group waged the First Congo War from October 1996 to May 1997 that ended with the overthrow of the DRC’s long-time ruler, Mobutu Sese Seko. Kabila became president.
Once the war ended, Kagame and Museveni fought alongside Congolese Tutsis to assert their citizenship. However, when Kabila turned against his backers, it triggered the Second Congo War, waged from 1998 to 2003, with Rwandan- and Ugandan-backed militias fighting against the DRC government.
The Democratic Forces for the Liberation of Rwanda (FDLR) was founded by perpetrators of the 1994 Rwandan genocide that killed 800,000 people, most of whom were Tutsi. The FDLR has been based in eastern Congo since 1996, after the Rwandan Patriotic Front, led by Kagame and others, pushed them out of Rwanda.
Fear of the FDLR was one of the drivers for the First Congo War. In a recent interview with CNN, Kagame said:
“If you want to ask me, is there a problem in Congo that concerns Rwanda? And that Rwanda would do anything to protect itself? I’d say 100 per cent.”
Control of minerals
Before the fall of Goma in February 2025, M23 captured mineral-rich areas like Rubaya, the largest coltan mine in the Great Lakes region; Kasika and Walikale, where there are numerous gold mines; Numbi, which is rich with tin, tungsten, tantalum and gold; and Minova, which is a trade hub.
In December 2024, a UN expert group noted that M23 exported about 150 tonnes of coltan to Rwanda, and was involved with Rwanda’s production, leading to “the largest contamination of mineral supply chain.”
One of the central dynamics of this conflict is the control and profit from natural resources. The DRC is rich in minerals and metals needed around the world, including the critical minerals used in the technology and renewable energy industries.
The World Bank has noted that the “DRC is endowed with exceptional mineral resources.” However, administration of the sector is dysfunctional and handicapped by insufficient institutional capacity.
Ending the M23 insurgency requires taking Tutsi citizenship seriously. Politics researcher Filip Reyntjens has argued that any peaceful transition in the DRC needs to take regional countries seriously. He emphasized:
“By turning a blind eye to Rwanda’s hegemonic claims in eastern Congo, the future stability of the region remains in doubt. Rwanda may once again, in the not too distant future, become the focal point of regional violence.”
A factor contributing to the violence is the lack of measures to ensure ceasefires are respected by the different parties engaged in conflicts. In addition, armed groups and their backers have not been effectively prosecuted. A 2010 UN mapping report describes 617 alleged incidents of war crimes, crimes against humanity and human rights violations between March 1993 and June 2003. No perpetrators have ever been prosecuted.
Furthermore, there must be strong international efforts to prevent conflict minerals from getting into international supply chains. M23 and other militias smuggle Congo’s minerals through regional neighbours, where they are considered conflict-free.
Tech giants that rely on these minerals must do more to scrutinize where they come from. Equally, all of us, as consumers of products made from the DRC’s minerals, must demand accountability.
It’s usually only men who participate in peace talks. Women, who endure the brutality of sexual violence and other human rights violations, must be represented in peace and security talks.
In his 2018 Nobel Peace Prize acceptance speech, Congolese physician and human rights activist Dr. Denis Mukwege noted that:
“What is the world waiting for before taking this into account? There is no lasting peace without justice. Yet, justice is not negotiable. Let us have the courage to take a critical and impartial look at what has been going on for too long in the Great Lakes region.”
To effectively respond to the plight of the people of eastern Congo will take more than situational and short-term intervention. National, regional and international parties must negotiate peaceful and just access to minerals. Peace and security in Congo will happen when sectarian and partisan politics is replaced with commitment to democracy, sovereignty and peoples’ well-being.
Evelyn Namakula Mayanja receives funding from the Social Sciences and Humanities Research Council Canada and Carleton University.
How might you make your mark on the world forever? Write a play more timeless than Shakespeare, or compose music to out-do Mozart, or score the winning goal in the next World Cup final, perhaps?
There’s an easier way of leaving an indelible mark on our planet. Just finish a soft drink and toss the can (and the remains of the chicken dinner that went with it), ditch last year’s impulse purchases from your wardrobe, resurface that old patio, upgrade your mobile phone … simply carry on with everyday life, that is, and you’ll likely leave a fascinating legacy. It might last a billion years.
We’re palaeontologists, and have spent our careers looking at the fossil record of the deep past, puzzling out how those magnificent animal and plant relics have been preserved as dinosaur bones, the carapaces of ancient crustaceans, lustrous spiralled ammonites, petrified flower petals and many more. Often they still have exquisite detail intact after millions of years.
We’ve now turned our attention to the myriad everyday objects that we make and use, to see what kind of future fossils – we call them technofossils – they will make. We’ve written about this in our new book, Discarded: how technofossils will be our ultimate legacy. Here are some key messages:
The first things that’ll catch the eye of any far-future palaeontologist are our manufactured objects – buildings, roads, machines and so on. In recent decades, they have rocketed in amount to over a trillion tonnes, to now outweigh all living things on Earth. That’s a lot of raw material for generating future fossils.
Then, most things we make are designed to be durable, to resist corrosion and decay, and are significantly tougher than the average bone or shell. Just from that they have a head start in the fossilisation stakes.
Many are new to the Earth. Discarded aluminium cans are everywhere, for instance, but to our planet, they’re a wondrous novelty, as pure aluminium metal is almost unknown in nature. In the past 70 years we’ve made more than 500 million tonnes of the stuff, enough to coat all of the US (and part of Canada) in standard aluminium kitchen foil.
What’s going to happen to it? Aluminium resists corrosion, but not forever. Buried underground in layers of mud and sand, a can will slowly break down, but often not before there’s a can-shaped impression in these new rocks, lined with microscopic clay crystals newly-grown out of the corroding aluminium.
Everyday items can be flushed onto a floodplain and be quickly buried under sediments. As they slowly degrade they may leave an impression on the soft muds and silts for future palaeontologists to puzzle over. Sarah Gabbott
Having been shielded from ultraviolet light, the thin plastic liner inside the can may endure too. (Oil-based plastic is even more novel in geological terms, being entirely non-existent until the 20th century). These two materials compressed side-by-side represent future fossil signatures of our time on Earth.
Billions of fossilised chicken thighs
But what about bones – the archetypal fossil relic? There will be many of these as future fossils, stark evidence of our species’ domination over others.
The standard supermarket chicken seems mundane. But it’s now by far the most common bird of all, making up about two-thirds of all bird biomass on Earth, and its abundance in life increases its fossilisation chances after death.
We stack the odds further by tossing the bones into a plastic bin-bag that’s then carted to the landfill site to join countless more bones for burial in neatly engineered compartments – also plastic-lined. There, the bones will begin to mummify, another useful step on the road to petrifaction. Our landfills are giant middens of the future and will be stuffed full of the bones of this one species.
Geologists of the far future may conclude that chickens could only have existed thanks to a more intelligent species. dba87 / shutterstock
These bones – super-sized but weak, riddled with osteoporosis, sometimes fractured and deformed – will tell their own grisly story. Future geologists will puzzle over a suddenly-evolved bird so abundant yet so physically helpless. Will they figure out the story of a broiler chicken genetically engineered to feed relentlessly to maximise weight gain, for slaughter just five or six weeks after hatching? We suspect the fossil evidence will be damning.
Fossilised fleeces
Fossilisable fashion is also new. Humans have worn clothes for thousands of years, but archaeological finds of clothing are rare because, made of natural fibres, garments are feasted on by clothes moths, microbes and other scavengers. Fossil fur and feathers are rare too, for the same reasons.
But cheap, cheerful and hyper-abundant polyester fashion is quite different. There’s no need for mothballs with these garments because synthetic plastics are indigestible to most microbes. How long might they last? Some ancient fossil algae have coats of plastic-like polymers, and these have lasted, beautifully preserved, for many millions of years.
Fossil clothes will surely perplex far-future palaeontologists, though: first to work out their shape from the crumpled and flattened remains, and then to work out what purpose they served. With throwaway fashion, we’re making some eternal puzzles.
Concrete and computers
The lumps of concrete from your old patio are not any old rocks. The recipe for concrete, involving furnace-baked lime, is rare on Earth (the minerals involved occasionally form in magma-baked rock), but humans have made it hyper-abundant. There are now more than half a trillion tonnes of concrete on Earth, mostly made since the 1950s – that’s a kilo per square metre averaged over the Earth. And concrete is hard-wearing even by geological standards: most of its bulk is sand and gravel, which have been survivors throughout our planet’s history.
There’s nothing old about computers and mobile phones, but they are based on the same element – silicon – that makes up the quartz (silicon dioxide) of sand and gravel. A fossilised silicon chip will be tricky to decipher, though: the semiconductors now packed on to them are just nanometres across, tinier than most mineral forms geologists analyse today.
But the associated paraphernalia, the burgeoning waste of keyboards, monitors, wiring, will form more obvious fossils. The patterns on these, like the QWERTY keyboard, resemble the fossil patterns seized upon by today’s palaeontologists as clues to ancient function. That would depend on the excavators, though: fossil keyboards would make more sense to hyper-evolved rats with five-fingered paws, say, than superintelligent octopuses of the far future.
It’s fun to conceptualise like this, and set the human story within the grand perspective of Earth’s history. But there’s a wider meaning. Tomorrow’s future fossils are today’s pollution: unsightly, damaging, often toxic, and ever more of a costly problem. One only has to look at the state of Britain’s rivers and beaches.
Understanding how fossilisation starts now helps us ask the right questions. When plastic trash is washed out to sea, will it keep travelling or become safely buried, covered by marine sediments? Will the waste in coastal landfill sites stay put, or be exhumed by the waves as sea level rises? The answers will be found in future rocks – but it would help us all to work them out now.
Sarah Gabbott is affiliated with Green Circle Nature Regeneration Community Interest Company 13084569.
Jan Zalasiewicz does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Source: United States Senator Ron Wyden (D-Ore)
February 12, 2025
Washington D.C.— U.S. Senator Ron Wyden today announced that Andrew Cutler has begun work as his new field representative for Eastern Oregon, covering Baker, Gilliam, Grant, Harney, Malheur, Morrow, Sherman, Umatilla, Union, Wallowa and Wheeler counties.
“I’m gratified to have someone on my team as knowledgeable and passionate as Andrew is about issues in Eastern Oregon,” Wyden said. “A field representative’s role is about being a region’s eyes and ears, hearing directly from locals about logical and meaningful solutions to the area’s specific challenges. I know Andrew will work hard to support and represent his fellow Eastern Oregonians in any way he can to shorten the distance between our state and Washington, DC.”
Cutler, a Treasure Valley Community College and Boise State University alum, comes into this role with a wealth of knowledge about the region. Prior to joining Wyden’s staff, Cutler was the regional editorial director for the EO Media Group from July 2020 to June 2024, where he also served as editor and publisher for the East Oregonian and the Hermiston Herald from May 2019 to June 2024. Cutler also served as editor of The Observer in La Grande from November 2012 to December 2017, and later returned as interim editor from May 2021 to June 2024. He also was the publisher of The Observer from 2015 to 2017.
“As a resident of Eastern Oregon since 2012, I know how important it is to help the region with issues where Senator Wyden can assist, such as securing federal funds, wildfire mitigation, economic development, rural healthcare, broadband accessibility and more,” Cutler said. “I look forward to collaborating with everyone in the region to work on solutions Senator Wyden can bring back to DC to make lasting and positive impacts here at home.”
Cutler replaces Kathleen Cathey, who retired in December 2024, after serving the people of Eastern Oregon on behalf of Senator Wyden for nearly 20 years.
“Kathleen leaves huge shoes to fill after close to two decades of service, and I immensely appreciate her deep community connections that enabled her to work successfully with farmers, ranchers, veterans, educators, local officials and all residents wanting to make Eastern Oregon an even better place to live and work,’’ Wyden said. “I’m confident Andrew will keep building on those accomplishments and helping me to generate new successes.”
Cutler can be reached at andrew_cutler@wyden.senate.gov while the previous Eastern Oregon office in La Grande is moved to Pendleton.
A public consultation has been launched asking businesses and residents to comment on a vision to grow Liverpool’s multi-billion-pound economy over the next 15 years.
The Inclusive Economic Growth Strategy will set the framework for growth up to 2040 and the eight-week consultation, hosted by Liverpool City Council, aims to inform the development of the resulting action plan.
The vision for Liverpool 2040 is to create a strong and inclusive economy that leaves no one behind.
The strategy focuses on strengthening foundations to build a fairer, more prosperous, and sustainable city that creates opportunities for a good life for all its residents.
The draft strategy focuses on several key themes, including:
Strengthening key sectors to drive growth, innovation, investment and productivity. Key sectors include Health & Life Sciences, Creative and Digital industries, Advanced Manufacturing and Maritime.
Building a vibrant, productive and resilient business base
Ensuring access to skills development, employment opportunities and career building
Placing people at the heart of growth activity and supporting aspirations and networks
Several public engagement events will be staged over the coming months to gather views from the public. People can also go online at www.liverpool.gov.uk/growthstrategyconsultation to find out more and give their feedback.
Liverpool currently powers a £16.7 billion economy, with over 14,000 businesses and around 230,000 people in employment.
However, significant challenges remain, including low productivity and investment, financial pressures on public services, inequality of opportunity in some communities, and health challenges.
In light of these challenges, the Council, which recently submitted a New Town bid to Government to regenerate a huge part of North Liverpool, is committed to supporting businesses and residents. Delivering an inclusive economy is a core pillar of Liverpool’s Strategic Partnership plan for 2040.
To further underline the Council’s commitment, since June 2023, its Business Support Service has provided advice and guidance to over 1,000 Liverpool businesses and supported 300+ residents with direct advice on starting up a new business.
The Adult Learning and Skills team has also supported over 4,500 residents to develop essential workplace skills, and the Ways to Work team has supported 1,708 economically inactive and unemployed residents with employment and skills services.
Councillor Nick Small, Liverpool City Council’s Cabinet Member for Development and Growth, said: “This draft Inclusive Economic Growth Strategy is a vital piece of work and one which will come to define the conditions that support our businesses to grow.
“Feedback to this draft strategy is crucial: it needs to reflect the views and needs of our businesses, non-profit organisations, charities and voluntary organisations – be it on education, transport, housing or digital connectivity.
“We also want to hear residents’ views to ensure we create a strong, relevant and deliverable strategy, one that will inform the initiatives, interventions and investment into the infrastructure the city needs to underpin our future economy.
“All of this feedback will help us strengthen the strategy, ensure we deliver the right action for economic growth, and best place us to build inclusivity so residents and communities thrive.”
Councillor Lila Bennett, Liverpool City Council’s Cabinet Member for Employment, Educational Attainment and Skills, said: “The success of this strategy will be deeply rooted in the strength and diversity of our partnerships and our collective commitment and action. All our partners have a key role in driving economic growth and ensuring benefits are felt across all communities.
“We also want our partners, including the business community, to embrace and deliver for our residents by realising opportunities and addressing challenges, from climate change to AI, to train and upskill their workforce to be ready for the economy of the future.”
CSL Behring’s Gene Therapy HEMGENIX® (etranacogene dezaparvovec-drlb) Four Years Post-Infusion Data Continue to Show Sustained Efficacy and Safety in Adults with Hemophilia B
94 percent of patients eliminated factor IX prophylaxis and remained free of continuous prophylaxis through four years post-treatment
Mean factor IX activity levels were sustained at near normal levels of 37% through four years post-treatment, reinforcing the efficacy of HEMGENIX in the treatment of hemophilia B
Phase 3 HOPE-B data showed that a one-time treatment with HEMGENIX provided long-term bleed protection as mean adjusted annualized bleeding rate (ABR) for all bleeds was reduced by approximately 90% from lead-in as compared to year four
KING OF PRUSSIA, Pa., Feb. 7, 2025 /PRNewswire/ — Global biotechnology leader CSL (ASX:CSL; USOTC:CSLLY) today announced the four-year results from the pivotal HOPE-B study confirming the long-term durability and safety of a one-time infusion of HEMGENIX® (etranacogene dezaparvovec-drlb) for adults living with hemophilia B. In an oral presentation at the 18th Annual Congress of the European Association for Haemophilia and Allied Disorders (EAHAD), data showed that through four years, HEMGENIX continues to deliver elevated and sustained factor IX activity levels, can offer long-term and greater bleed protection compared to prophylactic treatment, can eliminate the need for routine factor IX prophylaxis, and maintains a favorable safety profile. Approved in 2022 by the U.S. Food and Drug Administration (FDA), HEMGENIX is the first gene therapy for the treatment of adults with hemophilia B who currently use factor IX prophylaxis therapy, or have current or historical life-threatening bleeding, or have repeated, serious spontaneous bleeding episodes. It is also the only approved gene therapy for hemophilia B that can treat adult patients with and without AAV5 neutralizing antibodies thereby providing the potential for a greater number of eligible patients to be treated.
“Hemophilia B can cause spontaneous bleeds into the joints, resulting in extreme pain and progressive, arthritis-like damage, which can lead to permanent physical debility,” said Steven Pipe, MD, Professor of Pediatrics and Pathology, Laurence A. Boxer Research Professor of Pediatrics and Communicable Diseases, Pediatric Medical Director, Hemophilia and Coagulation Disorders Program Director, Special Coagulation Laboratory University of Michigan. “These results underscore the ability of HEMGENIX to offer long-term bleed protection with a one-time treatment, resulting in dramatic decreases in all annual bleed rates, including joint bleeds, and sustained independence from regular prophylactic infusions.”
In the Phase III, open-label, single-dose, single-arm HOPE-B trial, 54 adult male participants with severe or moderately severe hemophilia B, with or without preexisting AAV5 neutralizing antibodies, were infused with a single dose of HEMGENIX. Of the 54 participants who received HEMGENIX, 51 completed four years of follow-up. HEMGENIX produced mean factor IX levels of 41.5 IU/dL (n=50) at year one, 36.7 IU/dL (n=50) at year two, 38.6 IU/dL (n=48) at year three and 37.4 IU/dL (n=47) at year four post-infusion. In addition, mean adjusted annualized bleeding rate (ABR) for all bleeds was reduced by approximately 90% from lead-in (4.16, n=54) as compared to year four (0.40, n=51). Furthermore, joint bleeds were reduced from a mean ABR of 2.34 at lead-in to 0.09 during year four. In year four, 94% of patients remained free of continuous prophylaxis treatment. No patients returned to continuous prophylaxis between year three and year four.
There were no serious adverse events related to treatment with HEMGENIX. HEMGENIX was generally well-tolerated, with a total of 96 treatment-related adverse events (AEs), 92 (96%) of which occurred in the first six months post-treatment. The most common adverse event was an increase in alanine transaminase (ALT), for which nine (16.7%) participants received supportive care with reactive corticosteroids for a mean duration of 81.4 days (standard deviation: 28.6; range: 51–130 days).
“These data continue to instill confidence in the clinical benefits of HEMGENIX, highlighting the remarkable impact of this one-time treatment to reduce the frequency of bleeds in people with hemophilia B and improve quality of life by alleviating the burden of ongoing factor IX prophylactic treatment,” said Andres Brainsky, Vice President R&D Hematology at CSL. “CSL is committed to continuing to provide ongoing data analyses of HEMGENIX, ensuring that healthcare providers and patients have the necessary information to make informed decisions about treatment options. We are proud to continue to provide life-changing treatment options to the hemophilia community.”
The multi-year clinical development of HEMGENIX was led by uniQure (Nasdaq: QURE) and sponsorship of the clinical trials transitioned to CSL after it licensed global rights to commercialize the treatment. Additionally, CSL established a post-marketing registry, which will be informative to all stakeholders and will generate additional evidence on the long-term safety, efficacy, and durability of gene therapy. HEMGENIX has also been granted conditional marketing authorization by the European Commission (EC) for the European Union and European Economic Area, the UK’s Medicines and Healthcare products Regulatory Agency (MHRA), as well as authorization by Health Canada, Switzerland’s Swissmedic and provisional approval by Australia’s Therapeutic Goods Administration (TGA).
For more information on HEMGENIX, please visit www.Hemgenix.com.
About the Pivotal HOPE-B Trial The pivotal Phase III HOPE-B trial is an ongoing, multinational, open-label, single-arm study to evaluate the safety and efficacy of HEMGENIX. Fifty-four adult hemophilia B patients classified as having moderately severe to severe hemophilia B and requiring prophylactic factor IX replacement therapy were enrolled in a prospective, six-month or longer observational period during which time they continued to use their current standard of care therapy to establish a baseline Annual Bleeding Rate (ABR). After at least the six-month lead-in period, patients received a single intravenous administration of HEMGENIX at a 2×10^13 gc/kg dose. Patients were not excluded from the trial based on pre-existing neutralizing antibodies (NAbs) to AAV5.
A total of 54 patients received a single dose of HEMGENIX in the pivotal trial, with 51 patients completing at least four years of follow-up. The primary endpoint in the pivotal HOPE-B study was ABR 52 weeks after achievement of stable factor IX expression (months 7 to 18) compared with the six-month lead-in period. For this endpoint, ABR was measured from month seven to month 18 after infusion, ensuring the observation period represented a steady-state factor IX transgene expression. Secondary endpoints included assessment of factor IX activity.
No serious treatment-related adverse reactions were reported. One death resulting from urosepsis and cardiogenic shock in a 77-year-old patient at 65 weeks following dosing was considered unrelated to treatment by investigators and the sponsor company. A serious adverse event of hepatocellular carcinoma was determined to be unrelated to treatment with HEMGENIX by independent molecular tumor characterization and vector integration analysis. No inhibitors to factor IX were reported.
About Hemophilia B Hemophilia B is a life-threatening rare disease caused by a mutation on the F9 gene, resulting in low levels of functional clotting factor IX. People with the condition are particularly vulnerable to bleeds in their joints, muscles, and internal organs, leading to pain, swelling, and joint damage. Treatments for moderate to severe hemophilia B typically include life-long prophylactic infusions of factor IX to temporarily replace or supplement low levels of the blood-clotting factor.
About HEMGENIX® HEMGENIX is a gene therapy that reduces the rate of abnormal bleeding in eligible people with hemophilia B by enabling the body to continuously produce factor IX, the deficient protein in hemophilia B. It uses AAV5, a non-infectious viral vector, called an adeno-associated virus (AAV). The AAV5 vector carries the Padua gene variant of Factor IX (FIX-Padua) to the target cells in the liver, generating factor IX proteins that are 5x-8x more active than normal. These genetic instructions remain in the target cells, but generally do not become a part of a person’s own DNA. Once delivered, the new genetic instructions allow the cellular machinery to produce stable levels of factor IX.
Important Safety Information (ISI)
What is HEMGENIX®? HEMGENIX®, etranacogene dezaparvovec-drlb, is a one-time gene therapy for the treatment of adults with hemophilia B who:
Currently use Factor IX prophylaxis therapy, or
Have current or historical life-threatening bleeding, or
Have repeated, serious spontaneous bleeding episodes.
HEMGENIX is administered as a single intravenous infusion and can be administered only once.
What medical testing can I expect to be given before and after administration of HEMGENIX? To determine your eligibility to receive HEMGENIX, you will be tested for Factor IX inhibitors. If this test result is positive, a retest will be performed 2 weeks later. If both tests are positive for Factor IX inhibitors, your doctor will not administer HEMGENIX to you. If, after administration of HEMGENIX, increased Factor IX activity is not achieved, or bleeding is not controlled, a post-dose test for Factor IX inhibitors will be performed.
HEMGENIX may lead to elevations of liver enzymes in the blood; therefore, ultrasound and other testing will be performed to check on liver health before HEMGENIX can be administered. Following administration of HEMGENIX, your doctor will monitor your liver enzyme levels weekly for at least 3 months. If you have preexisting risk factors for liver cancer, regular liver health testing will continue for 5 years post-administration. Treatment for elevated liver enzymes could include corticosteroids.
What were the most common side effects of HEMGENIX in clinical trials? In clinical trials for HEMGENIX, the most common side effects reported in more than 5% of patients were liver enzyme elevations, headache, elevated levels of a certain blood enzyme, flu-like symptoms, infusion-related reactions, fatigue, nausea, and feeling unwell. These are not the only side effects possible. Tell your healthcare provider about any side effect you may experience.
What should I watch for during infusion with HEMGENIX? Your doctor will monitor you for infusion-related reactions during administration of HEMGENIX, as well as for at least 3 hours after the infusion is complete. Symptoms may include chest tightness, headaches, abdominal pain, lightheadedness, flu-like symptoms, shivering, flushing, rash, and elevated blood pressure. If an infusion-related reaction occurs, the doctor may slow or stop the HEMGENIX infusion, resuming at a lower infusion rate once symptoms resolve.
What should I avoid after receiving HEMGENIX? Small amounts of HEMGENIX may be present in your blood, semen, and other excreted/secreted materials, and it is not known how long this continues. You should not donate blood, organs, tissues, or cells for transplantation after receiving HEMGENIX.
You are encouraged to report negative side effects of prescription drugs to the FDA. Visit www.fda.gov/medwatch, or call 1-800-FDA-1088.
You can also report side effects to CSL Behring’s Pharmacovigilance Department at 1-866-915-6958.
About CSL CSL (ASX:CSL; USOTC:CSLLY) is a global biotechnology company with a dynamic portfolio of lifesaving medicines, including those that treat haemophilia and immune deficiencies, vaccines to prevent influenza, and therapies in iron deficiency and nephrology. Since our start in 1916, we have been driven by our promise to save lives using the latest technologies. Today, CSL – including our three businesses: CSL Behring, CSL Seqirus and CSL Vifor – provides lifesaving products to patients in more than 100 countries and employs 32,000 people. Our unique combination of commercial strength, R&D focus and operational excellence enables us to identify, develop and deliver innovations so our patients can live life to the fullest. For inspiring stories about the promise of biotechnology, visit CSL.com/Vita and follow us on Twitter.com/CSL.
Today, under the direction of the Acting Chairman, Staff Legal Bulletin 14L is now rescinded by the issuance of Staff Legal Bulletin 14M (“SLB 14M”). SLB 14M moves the goalposts smack dab in the middle of this year’s shareholder proposal process. Doing so at this hour creates undue costs and uncertainty for investors and corporations alike. This type of political policy shifting mid-season serves to undercut capital formation, not facilitate it.
SLB 14M implements different rules of the road for the process of excluding shareholder proposals from issuers’ proxy statements.[1] Such proposals include topics relating to poison pills, compensation, emerging issues such as AI, political and lobbying expenditures, and environmental or other issues that shareholders have identified as materially impacting the firm’s financial value.[2] The fact that the change is taking place at this time is significant. As anyone familiar with the shareholder proposal process knows, excluding a proposal from the proxy statement all but guarantees it will never make it to a shareholder vote.
The rescission comes as no surprise given that the shareholder proposal process has become the target of politicized messaging and a preferred punching bag of those who wish to diminish corporate democracy. This is the case even though there are already numerous other mechanisms in place to limit the availability of the proxy ballot to shareholders.[3] Though the shareholder proposal process is designed merely to facilitate a dialogue with investors, today’s action drowns out investor voices and facilitates corporate monologues instead.[4]
Even though the rescission may not be a surprise, the timing of this action is arbitrary and inequitable. Shareholders have already crafted and submitted their proposals for this season. Corporations and shareholders will incur costs to supplement or alter no-action requests and responses. Further, SEC staff have already issued no-action letter responses related to proposals for this proxy season. Even for those stakeholders and observers who prefer a different approach to this process, the end result is quite possibly disparate treatment not just for shareholders, but for issuers as well. We are so focused on undoing the prior Commission’s agenda that we sow chaos now. By choosing this path, we forsake all consistency, and perhaps even the legitimacy, of the independent, historically staff-governed process to the detriment of all parties.
While costly and confusing, corporations will still have a chance to revise their no-action requests to exclude proposals. Shareholders, of course, will have no such opportunity. The 14a-8 no-action process is fact-intensive, and exactly how a proposal is crafted is often determinative of its exclusion or inclusion. It is now too late for most shareholders to design proposals in line with SLB 14M.
Instead of taking a measured approach that would ensure market stability and a meaningful consideration of cost and benefit, this leadership has rushed out staff guidance for the sake of political expediency, and at significant cost to shareholders, corporations, and SEC staff resources.
[1] SLB 14M rescinds Staff Legal Bulletin No. 14L and, in large part, reinstates previous guidance on staff views relating to the (i)(5) and (i)(7) substantive bases for exclusion. See Staff Legal Bulletin No. 14M. It is important to note that (i)(7) was the most often used exclusion in the 2024 proxy season. See Merel Spierings, The Conference Board, 2024 Proxy Season Review: Corporate Resilience in a Polarized Landscape, Harv. L. Sch. Forum on Corp. Gov. (Oct. 12, 2024).
[3] For example, shareholders must meet certain ownership and resubmission thresholds to submit a proposal, and proposals are subject to a 500-word limit. See 17 CFR 240.14a-8(b)(1), (d), & (i)(12).
[4] Additionally, shareholder proposals are precatory, or merely advisory, in nature. See 17 CFR 240.14a-8(i)(1).
Source: United Kingdom – Executive Government & Departments
A study published in JAMA Psychiatry looks at the use of semaglutide in adults with alcohol use disorder.
Dr Riccardo De Giorgi, Clinical Lecturer at the Department of Psychiatry, University of Oxford, said:
“There has been much sensation (and even more noise) about GLP-1 drugs such as semaglutide in the medical field, especially regarding mental health. However, their potential use as a mechanistically novel treatment for addiction is perhaps one of the most promising research avenues. This investigator-initiated phase 2 randomised, placebo-controlled trial was small (48 people) but sound and well-designed. It looked at several outcomes of importance to alcohol misuse. It represents, at present, the most robust and yet preliminary piece of evidence suggesting that these medications may indeed be useful for the care of people with alcohol use disorder – an extremely disabling condition. Semaglutide appeared to be safe and well-tolerated, though it should be noted that the administered dose was not large (0.5mg) and it was given over a relatively short period of time (8 weeks). This is the kind of study of which we need to see more if we are to see progress in this key research area.”
Prof Matt Field, Professor of Psychology, University of Sheffield, said:
“Some recent research suggests that semaglutide can reduce alcohol consumption in people with alcohol use disorders. Those studies were observational, which means it is difficult to attribute the reduction in alcohol consumption to semaglutide rather than to confounding factors. The present study overcomes these limitations by randomising adults with alcohol use disorder to receive weekly injections of either low-dose semaglutide or placebo over 9 weeks. Participants recorded how much alcohol they drank over this period, and they also completed laboratory sessions at the beginning and end of the study period in which they could consume alcoholic drinks. The research team found that, compared to the placebo group, the group who had received semaglutide drank significantly less alcohol in the lab. Furthermore, although the semaglutide and placebo groups did not differ in how often they drank alcohol during the study period (outside the lab), on days when they did drink alcohol the semaglutide group drank less alcohol than the placebo group.
“Overall, this randomised study goes beyond previous observational studies which tended to look at people who were prescribed semaglutide for other reasons (usually diabetes) and evaluate how the drug affected their alcohol consumption. With those types of observational studies, it is difficult to know if any effects on alcohol consumption were attributable to the drug or to confounding factors. This study overcomes those limitations by demonstrating, for the first time, a causal effect of semaglutide on the amount of alcohol that people drink. This study will hopefully serve as a springboard for further research. Furthermore, the nature of the semaglutide effect (reducing the amount of alcohol consumed, whilst having no effect on the number of days that people drank alcohol) is consistent with the idea that semaglutide reduces the reward or pleasure that people get from drinking alcohol, which is why they drink less.
“Some limitations of the study include the characteristics of the sample, who were not seeking treatment and were not motivated to reduce their alcohol consumption or stop drinking. Most new treatments for alcohol use disorder are evaluated in people who ask for treatment because they want help to stop drinking altogether or reduce their drinking, so it will be important to test the effects of semaglutide on people with these characteristics. A cautionary tale is that, when promising medications are tested in people with alcohol use disorder who are trying to cut down their drinking, we often see a large placebo response (i.e. a reduction in drinking among people taking placebo), which can obscure any additional effect of the drug. Other considerations are that participants in this study had a body mass index (BMI) of at least 23, and most had a BMI of 30 or higher (which is in the obese range). It will be important to establish if semaglutide can also reduce alcohol consumption in people who are not obese, particularly given that many people who seek treatment for alcohol problems are underweight. This study had a small sample size and a short follow-up period, so it will be important to see if the effects of semaglutide are maintained over a longer time period, and, crucially, what happens when people stop taking the medication. It will also be important to consider if and how semaglutide can be incorporated into conventional treatment for alcohol use disorder which might include detoxification, counselling or talking therapies, other types of medications, and involvement with mutual aid groups such as alcoholics anonymous.”
‘Once-Weekly Semaglutide in Adults With Alcohol Use Disorder: A Randomized Clinical Trial’ by Christian S. Hendershot et al. was published in JAMA Psychiatry at 16:00 UK time on Wednesday 12 February 2025.
DOI: 10.1001/jamapsychiatry.2024.4789
Declared interests
Dr Riccardo De Giorgi: “I am supported by the NIHR Oxford Health Biomedical Research Centre and currently conduct research on GLP-1 medications (NIHR OH BRC funded; no industry or any other kind of funding).”
Prof Matt Field: “I have no conflicts of interest to declare.”
American president Donald Trump has issued an executive order to withdraw aid from South Africa. He was reacting to what he has called the South African government’s plan to “seize ethnic minority Afrikaners’ agricultural property without compensation”. Afrikaners are an ethnic and linguistic community of white South Africans whose home language is Afrikaans.
Trump’s action, amplified by provocative comments from billionaire Elon Musk, has reignited debate about the concept of “white victimhood”. We asked Nicky Falkof, who has researched the idea of white victimhood, for her insights.
What does ‘white victimhood’ mean?
White victimhood refers to a powerful set of beliefs that treats white people as special and different, but also as uniquely at risk. Within this narrative white people see themselves, and are sometimes seen by others, as extraordinary victims, whose exposure to violence or vulnerability is more concerning and important than anyone else’s.
White victimhood is usually speculative. It relates not to actual events that have happened, but to white people’s feelings of being threatened or unsafe. Entire political agendas develop around the idea that white people must be protected because they face exceptional threats, which are not being taken seriously by a contemporary world order that fails to value whiteness.
This is by no means particular to South Africa; we see it wherever whiteness is predominant. Indeed, ideas about white victimhood play a significant role in the popularity of Trump, whose call to “make America great again” harks back to an idealised past where white people (particularly men) could easily dominate the nation, the workplace and the home.
The South African case is important because it plays a central role in global white supremacist claims. These mythologies claim that white South Africans, specifically Afrikaners, are the canary in the coalmine: that the alleged oppression they are facing is a blueprint for what will happen to all white people if they don’t “fight back”.
What is its history?
We can trace this idea back to the start of the colonial project. In 1660 Dutch East India Company administrator Jan van Riebeeck planted a hedge of bitter almond shrubs to separate his trading station from the rest of South Africa’s Cape. This hedge was part of a defensive barrier intended to keep indigenous people out of the Dutch trading post, which had been built on top of ancient Khoikhoi grazing routes.
On a practical level, van Riebeeck’s hedge was meant to shield Dutch settlers and livestock from Khoikhoi raiders. On a philosophical level, the hedge situated the invaders as the “real” victims, who desperately needed protection from the violence and wildness of Africa. The bitter almond hedge is still seen as an enduring symbol of white supremacy in the country.
This early paranoia and securitisation has had a significant effect on white South African culture and anxiety. White people who can afford to do so barricade themselves in gated communities and boomed-off suburban streets, behind high walls topped with razor wire, on the assumption that they are the primary victims of South Africa’s crime rate.
In what ways has victimhood been used over the centuries or decades?
Ideas about white victimhood have played a role in many of South Africa’s most influential social formations.
The 1930s saw a major panic around “poor whites”, which led to commissions of inquiry, upliftment programmes and other attempts at social engineering. The people and institutions behind these initiatives weren’t concerned about poverty in South Africa in general, even though it was becoming more of a problem as the population urbanised. Their only interest was in poverty among white people, drawing on the assumption that it’s wrong or abnormal for white people to be poor, and that this needed to be urgently remedied.
These moves were not simply about philanthropy and offering better life chances to poor people; they were about protecting the boundaries of whiteness. Poor whites were seen as a threat to the establishment because they proved that whiteness wasn’t inherently superior.
More recently, the victimhood narrative has been a central part of the panic around farm murders and claims of “white genocide”, an old idea that has been popularised and spread online.
Rural violence is a huge problem in South Africa that deserves a strong response. But white people are far from its only casualties. Indeed, violent crime affects pretty much everyone in South Africa. When the deaths of white people are explained as part of a targeted genocide undertaken on the basis of race, the message is that they matter more than the deaths of everyone else.
Again, this suggests a kind of naturalisation of violence and harm. When terrible things happen to other people they simply happen and are not remarked on. It’s only when white people are affected that they become a pressing issue.
Has it helped white South Africans? Has it been effective as a mobilising tool?
White victimhood, like the racial anxiety it is part of, is not good for white people. It doesn’t keep them safer or help them to live better lives.
That said, it’s been quite effective as a mobilising tool. The apartheid-era National Party was skilled at using white fear for political gain. Its communications constantly played on white fears of the swart gevaar, the “black danger”, which encapsulated the powerful belief that whites were more at risk from black people than vice versa, despite all evidence to the contrary.
Similarly, contemporary organisations like the Afrikaner “minority rights” pressure group AfriForum and the Afrikaans trade union Solidarity activate and manipulate white people’s senses of extraordinary victimhood. This drives them further into a defensive position, where everything from farm murders and road name changes to the National Health Insurance bill is designed to attack them personally.
White support for these kinds of organisations and the political positions they espouse, whether overtly or covertly, is at least in part driven by the effective manipulation of white victimhood.
How effective is it still?
It remains disturbingly powerful. The architecture of white supremacy depends on the idea that white people are extraordinary victims. This is the driving notion beneath the great replacement theory, a far-right conspiracy theory claiming that Jews and non-white foreigners are plotting to “replace” whites. It also underpins violent reactions to the global migration crisis and the rise of populism in the north.
I don’t think it’s going too far to say that whiteness as a social construction is intrinsically tied to victimhood. The idea that whiteness actually makes people more rather than less vulnerable is likely to remain a central part of white people’s collective psychic imaginary for some time.
Nicky Falkof receives funding from the South African National Research Foundation.
With the endless stream of announcements, reversals, measures and countermeasures coming from the new administration of United States President Donald Trump, it has become difficult to make sense of what is just noise or opening negotiation offers and what constitutes actual policy change.
Unfortunately, in the case of the global response against HIV/AIDS, it seems the attacks go beyond bluster.
The methods used in the fight against HIV/AIDS have long been disputed, but overall commitment to the response was one of the few deeply bipartisan endeavours left, until now. Undercutting this decades-long consensus would mean endangering millions of lives.
U.S. role in global HIV/AIDS response
As a PhD candidate in international relations working on the politics of the response to HIV/AIDS, I am very aware of the central role that the U.S. has played in building and sustaining a global response to the epidemic in the past 25 years.
The U.S. is also a fundamental participant in HIV/AIDS research, including through the work of the Centers for Disease Control (CDC) and the National Institutes of Health (NIH), as well as USAID.
All of this involvement has already been dangerously jeopardized by the actions taken by the White House since Trump took office for his second term.
The chaos wrought by these measures has impacted the response to HIV/AIDS in deep ways, even if they may be contested or reversed by the courts and Congress.
The uncertainty in itself is damaging for programs that need reliable funding and long-term planning, not to mention the clinical trials that have been brutally interrupted. What’s more, there are indications the Trump administration and other Republicans have abandoned the longstanding commitment to the response itself, which may lead to irreparable damage.
American involvement in the global response to HIV/AIDS has long been shaped by domestic politics. Most notably, PEPFAR’s first rounds of funding were deeply constrained by the views of George W. Bush’s evangelical constituency, including in its focus on abstinence as prevention and denial of funding for sex workers.
The Heritage Foundation, the conservative think-tank behind the potential blueprint for Trump’s government known as Project 2025, has referred to HIV/AIDS as a lifestyle disease, like tobacco consumption. This language is reminiscent of the 1980s playbook of opponents of AIDS action and negates both the nature of the epidemic and the realities of those who live with the virus, casting doubts on the need to engage meaningfully with the response.
Most ominously, the last reauthorization of PEPFAR in 2024 was limited to one year instead of the customary five, as some Republican representatives sought to end it altogether. This means the entire program is to be re-examined this March with no guarantee of how the debates will unfold, especially in the current climate.
Ultimately, much will depend on Congress, including the amount pledged by the U.S. to the Global Fund at its replenishment conference sometime this year.
Its decisions will be the real test of the depth of change on this matter, though everything that has unfolded so far hints at a far-reaching shattering of the consensus. If conservative Republicans maintain their pressure on PEPFAR, the program could be significantly diminished, and it is unlikely that a White House that withdrew from the World Health Organization on day one will act decisively to save it or insist on a sustained contribution to the Global Fund.
Consequences of U.S. disengagement
The consequences of a U.S. retreat from the global response to HIV/AIDS would be immense.
In the short term, millions of people would lose access to the treatment they depend on for their survival. In the long term, shrinking American funding would undermine health systems around the world and risk the resurgence of the pandemic and the rise of resistant virus strains.
This would jeopardize 40 years of progress, returning us to a time when AIDS was considered a key security risk and threat to development.
Even if funding is maintained, all of this shows that for the next few years the U.S. is unlikely to be reliable. This means others will have to take up the leadership to ensure the worst-case scenario is avoided.
Among these, Canada could have a crucial role to play. It has long been a key contributor in its own right — the seventh largest donor to the Global Fund — though Ottawa has remained discreet in this area so far. Washington’s withdrawal from the field may force it to step into a more visible role and contribute to reframing Canada’s international involvement.
Yolaine Frossard de Saugy does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Time to ‘Celebrate Our Heritage’ at Strabane St Patrick’s Day Parade
12 February 2025
Final preparations are underway to make this year’s St Patrick’s Day parade in Strabane bigger and better than ever.
The theme for this year’s parade is ‘Celebrating Our Heritage’ and over the last few months local schools, clubs, community groups, bands and individuals have been working hard creating eye-catching costumes and props, and practising their dances and tunes in readiness for the 17th March.
Schools taking part in this year’s parade include St Catherine’s PS, Holy Cross College, Sion Mills Integrated PS, Knockavoe School, and Gaelscoil Ui Dhochartaigh. Among the groups who will be participating are Sion Swifts, Sigersons GAA Club, Niamh Brown McGranaghan School of Irish Dance, Much Ado Performing Arts Academy and Class Act Theatre Group.
Preparing the young people to step out with confidence on St Patrick’s Day are Streetwise Community Circus and the North West Carnival Initiative. Streetwise have been working with the local schoolchildren to teach them a variety of circus skills, including juggling and stilt walking, and have also been guiding them in the intricacies of prop design. Around 120 children from local schools will take part in the parade, each carrying a prop they have created especially for the occasion.
The North West Carnival Initiative have been working with the local sports clubs and dance/drama groups in preparation for their part in the day, helping them build props, costumes and banners which will be showcased during the parade.
Providing music on the day will be a number of talented local bands.
Encouraging people to come out and enjoy the fabulous St Patrick’s Day Parade, the Mayor of Derry City and Strabane District Council, Cllr Lilian Seenoi Barr said: “We’ve all had enough of the cold, dark days of winter and we are ready to welcome the warmer days of Spring – what better way to greet the new season than with an incredible St Patrick’s Day Parade full of fun, colour, music and dance.
“I would encourage everyone in Strabane to come out and celebrate our wonderful heritage and traditions with this special day. Please give your support to all the young people and individuals who have worked so hard to create this wonderful event for you to enjoy. I can guarantee even if the sun doesn’t shine that you’ll have a smile on your face!”
This year’s parade will depart from Holy Cross College at 2pm and make its way down the Melmount Road, along Bridge Street and Market Street, past Abercorn Square and along Railway Road before finishing at Dock Street.
There will be activity in the Alley Theatre from 1.30-4.30pm with live music from CRAIC, face painting and Barry McGowan Art.
Later that evening the Strabane Drama Festival will continue at the Alley Theatre with The Whiteheaded Boy by Lennox Robinson, presented by the Bart Players. Tickets for this performance and further information about the Drama Festival are available at www.alley-theatre.com.
Full details of the Strabane St Patrick’s Day celebrations are available at www.derrystrabane.com/stpatricksdaystrabane, and you can follow St Patrick’s Day Strabane on Facebook for all the latest information.
The Women’s Rugby World Cup trophy will be the star attraction at Sunderland AFC’s pre-match fan zone next month.
The trophy’s appearance is part of a rugby takeover of the fan zone on International Women’s Day, on Saturday 8 March.
Families coming along to the fan zone at the Beacon of Light ahead of the Sunderland v Cardiff City match will be able to try their hand at a whole range of exciting rugby-inspired activities on the day.
Councillor Beth Jones, Cabinet Member for Communities, Culture and Tourism at Sunderland City Council, said: “We’re thrilled to have secured the Women’s Rugby World Cup trophy for our fan zone takeover on International Women’s Day.
“With just months to go until England’s Red Roses kick off the opening match of the Women’s Rugby World Cup at the Stadium of Light on Friday 22 August, the fan zone event is a great opportunity to showcase everything rugby has to offer.
“Even if you don’t know anything about the sport, it’s a fantastic way to immerse yourself in all things rugby.
“There’ll be something for everyone no matter what your age or ability, including walking rugby, fun fitness sessions with a rugby twist, children’s activities, tag rugby, and rugby skills on show from local clubs, as well as the chance to hear about the new T1 rugby offer coming soon to the city.
“So this is a brilliant chance to come along and find out all about our Active Sunderland community rugby offer and learn more about our fantastic local rugby clubs. You’ll also be able to find out how to get tickets for the England v USA opening match. And, you can even have your photo taken with the Women’s Rugby World Cup trophy.”
The Beacon of Light will be hosting the fan zone takeover from 12.30-2.30pm on Saturday 8 March, with match-goers and non match-goers alike welcome to come along and join the fun. All activities are free.
Match-goers will also be able to see girls from Houghton Rugby Club’s under-12s team demonstrating their rugby skills when they take to the pitch at the Stadium of Light at half time during the Cardiff City game.
The fan zone takeover is being organised by the RFU, University of Sunderland, local rugby clubs, the Foundation of Light, SAFC, Sunderland BID, Newcastle Falcons and Sunderland City Council.
Source: The Conversation – USA – By Christopher Wolff, Associate Professor of Anthropology, University at Albany, State University of New York
Leola One Feather of the Oglala Sioux Tribe observes as Native American artifacts are photographed at the Founders Museum in Barre, Mass., in 2022, before their return. AP Photo/Philip Marcelo
When you picture an archaeologist, you might imagine someone traveling to a remote location, digging into the ground, and returning to a lab in a university or museum to study the remains of past civilizations, hoping to answer important questions.
In contrast, I’ve often found myself working to return those remains to their rightful cultures. Repatriation is the process of returning ancestral human remains and important objects to descendant populations. Since the passing of the National Museum of the American Indian Act in 1989 and the Native American Graves Protection and Repatriation Act in 1990, it has become an increasingly important part of archaeological practice, yet about 110,000 ancestors remain in collections.
This work is about more than legal obligations. To many researchers such as myself, it is a matter of human rights.
When first enacted, these laws were controversial among archaeologists. Much of this anxiety stemmed from worries about losing access to research opportunities. Some concerns were shaped by legal battles surrounding the remains of “Kennewick Man,” whom Indigenous people refer to as the “Ancient One.” This man’s remains were found in Washington state in 1996 and dated to over 8,000 years ago. Scientists won the legal right to study them, in opposition to local tribal nations’ requests, until a 2016 law returned the remains of the individual to those groups.
Over time, many archaeologists have seen that while repatriation requirements limit research in some ways, in others they have been beneficial and improved aspects of archaeologists’ relationships with Indigenous communities.
This is not an idea I was exposed to as a graduate student. Like many others in my field, I had virtually no exposure to the actual process of repatriation, even more than a decade after the Native American Graves Protection and Repatriation Act, called NAGPRA, was signed into law. Rather, it is one that developed while I served as a repatriation archaeologist for the Smithsonian National Museum of Natural History from 2009-2011, and in the following years as a professor of archaeology.
Dancers from the Haida Tribe perform at the Field Museum in Chicago in 2003, celebrating the return of Haida human remains to their descendants. AP Photo/M. Spencer Green
Careful process
Repatriation includes important steps that are required by law, as well as other ethical considerations. First, any human remains or objects that fall within certain categories – such as sacred objects, or funerary objects – should be stored where they can be properly cared for with respect. For instance, Indigenous groups may ask that tobacco be placed with the remains, as an offering to their ancestors’ spirits.
Researchers must compile information about these human remains into an itemized list containing the number of individuals and objects, brief descriptions of them, where they were found, and how they came into the institution’s possession. This list is then provided to representatives of communities that may be descendants, or possible living relatives.
If those communities decide to request the remains’ return, then the formal process of assessing “cultural affiliation” begins. This is a thorough analysis of any evidence demonstrating a connection between the remains or objects and a particular group today. Evidence can include many things, including physical characteristics of the human remains or objects, written documents, oral history, or distinct cultural attributes of the artifacts.
Legally, this process is required only for federally recognized Indigenous groups. However, institutions can choose to apply the same consideration to other communities if they believe it is appropriate, such as the hundreds of Indigenous groups that lack federal recognition.
The analysis is officially submitted to the national NAGPRA database, and a public notice is posted so that other interested parties can make a claim on the remains or objects.
If researchers confirm there is a cultural affiliation, after a 90-day waiting period an official repatriation statement is filed with the national office. Researchers then consult with the requesting parties about how to conduct the physical return. What happens next is in the hands of the affiliated groups, and their wishes must be accommodated.
Kurt Riley, then the governor of the Pueblo of Acoma, speaks at the Smithsonian National Museum of the American Indian in 2016, protesting a French auction house’s plans to sell Indigenous artifacts. AP Photo/Andrew Harnik
Unfortunately, many remains have already suffered significant damage by the time repatriation begins. A great many of them have sat on shelves unstudied, sometimes for decades or longer – even those that came into the collection legally and in collaboration with Indigenous groups.
Powerful moment
One such individual was the key to a major shift in how I viewed repatriation – no longer as a research hindrance but as a question of human rights. Out of respect for the Indigenous nation, I cannot discuss specifics – only a broader picture of this “aha” moment.
One day at work, I found myself looking at an individual who had died several centuries ago, but was so well preserved that his death looked much more recent. It can be too easy to look at a collection of human bones and forget that they were once a living person, despite trying to teach students otherwise. However, that day I looked down and clearly saw a man: his face painted, his hair neatly done, earrings in his ears, laid out in a beautiful box.
Obviously, whoever tended to him after his death had taken great care, placing him in a sacred place where he had every expectation that he would be left undisturbed. He could not have perceived that centuries later someone would collect his remains and ship him away from his traditional lands to be studied in a museum.
That hit home for me. I would not want someone to go against my final wishes, or those of my family, and felt this man should have the same human rights I have in that regard.
I regret it took me so long to see that. Ever since, I’ve worked hard to make up for that by teaching my students to see the past full of people with expectations, hopes and emotions, and to extend ethical obligations to them as we would want applied to us. Archaeology is about learning from the past, and working in repatriation and meeting this individual provided me with one of the best lessons of my career.
Christopher Wolff does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
An African journalist films President Xi Jinping delivering an opening ceremony speech for the China-Africa forum in Beijing in September 2024. AP Photo/Andy Wong
Every year, China’s minister of foreign affairs embarks on what has now become a customary odyssey across Africa. The tradition began in the late 1980s and sees Beijing’s top diplomat visit several African nations to reaffirm ties. The most recent visit, by Foreign Minister Wang Yi, took place in mid-January 2025 and included stops in Namibia, the Republic of the Congo, Chad and Nigeria.
For over two decades, China’s burgeoning influence in Africa was symbolized by grand displays of infrastructural might. From Nairobi’s gleaming towers to expansive ports dotting the continent’s shorelines, China’s investments on the continent have surged, reaching over US$700 billion by 2023 under the Belt and Road Initiative, China’s massive global infrastructure development strategy.
But in recent years, Beijing has sought to expand beyond roads and skyscrapers and has made a play for the hearts and minds of African people. With a deft mix of persuasion, power and money, Beijing has turned to African media as a potential conduit for its geopolitical ambitions.
Partnering with local outlets and journalist-training initiatives, China has expanded its media footprint in Africa. Its purpose? To change perceptions and anchor the idea of Beijing as a provider of resources and assistance, and a model for development and governance.
The ploy appears to be paying dividends, with evidence of sections of the media giving favorable coverage to China. But as someone researching the reach of China’s influence overseas, I am beginning to see a nascent backlash against pro-Beijing reporting in countries across the continent.
The media charm offensive
China’s approach to Africa rests mainly on its use of “soft power,” manifested through things like the media and cultural programs. Beijing presents this as “win-win cooperation” – a quintessential Chinese diplomatic phrase mixing collaboration with cultural diplomacy.
CGTN Africa, which was set up in 2012, offers a Chinese perspective on African news. The network produces content in multiple languages, including English, French and Swahili, and its coverage routinely portrays Beijing as a constructive partner, reporting on infrastructure projects, trade agreements and cultural initiatives. Moreover, Xinhua News Agency, China’s state news agency, now boasts 37 bureaus on the continent.
By contrast, Western media presence in Africa remains comparatively limited. The BBC, long embedded due to the United Kingdom’s colonial legacy, still maintains a large footprint among foreign outlets, but its influence is largely historical rather than expanding. And as Western media influence in Africa has plateaued, China’s state-backed media has grown exponentially. This expansion is especially evident in the digital domain. On Facebook, for example, CGTN Africa commands a staggering 4.5 million followers, vastly outpacing CNN Africa, which has 1.2 million — a stark indicator of China’s growing soft power reach.
China’s zero-tariff trade policy with 33 African countries showcases how it uses economic policies to mold perceptions. And state-backed media outlets like CGTN Africa and Xinhua are central to highlighting such projects and pushing an image of China as a benevolent partner.
Questions of media veracity notwithstanding, China’s strategy is bearing fruit. A Gallup poll from April 2024 showed China’s approval ratings climbing in Africa as U.S. ratings dipped. Afrobarometer, a pan-African research organization, further reports that public opinion of China in many African countries is positively glowing, an apparent validation of China’s discourse engineering.
Further, studies have shown that pro-Beijing media influences perceptions. A 2023 survey of Zimbabweans found that those who were exposed to Chinese media were more likely to have a positive view of Beijing’s economic activities in the country.
China’s foreign minister Wang Yi, center, holds hands with his counterparts, Senegal’s Yassine Fall, left, and the Republic of the Congo’s Jean-Claude Gakosso, after a joint news conference. AP Photo/Andy Wong
Co-opting local voices
The effectiveness of China’s media strategy becomes especially apparent in the integration of local media. Through content-sharing agreements, African outlets have disseminated Beijing’s editorial line and stories from Chinese state media, often without the due diligence of journalistic skepticism.
Ethiopia exemplifies how China’s infrastructure investments and media influence have fostered a largely favorable perception of Beijing. State media outlets, often staffed by journalists trained in Chinese-run programs, consistently frame China’s role as one of selfless partnership. Coverage of projects like the Addis Ababa-Djibouti railway line highlights the benefits, while omitting reports on the substandard labor conditions tied to such projects — an approach reflective of Ethiopia’s media landscape, where state-run outlets prioritize economic development narratives and rely heavily on Xinhua as a primary news source.
Beneath the surface of China’s well-publicized projects and media offerings, and the African countries or organizations that embrace Beijing’s line, a significant countervailing force exists that challenges uncritical representations and pursues rigorous journalism.
Yet as CGTN Africa and Xinhua become entrenched in African media ecosystems, a pertinent question comes to the forefront: Will Africa’s journalists and press be able to uphold their impartiality and retain intellectual independence?
As China continues to make strategic inroads in Africa, it’s a fair question.
Mitchell Gallagher does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Source: United Kingdom – Executive Government & Departments
Latest information and actions from the Education and Skills Funding Agency for academies, schools, colleges, local authorities and further education providers.
Source: United Kingdom – Executive Government Non-Ministerial Departments
Charity Commission CEO David Holdsworth discusses the power of philanthropy at The Beacon Philanthropy and Impact Forum 2025.
Introduction
Good afternoon, I am delighted to be here with you.
I’d like to thank the Beacon Collaborative for bringing us together today, helping us think with many minds on one urgent challenge: how to grow the value and impact of philanthropy in our nations and around the world.
It is apt that we are meeting here at Guildhall, a place that speaks to the close relationship between commerce and charity in this city. The City Bridge Trust, administered by the Corporation of London and based here at Guildhall, made grants worth £30m to charities across the capital last year alone. Over the same period, the Lord Mayor’s Appeal, which works to encourage philanthropy in the city, spent over £3m on projects designed to strengthen communities and cohesion across London.
These initiatives recognise and reflect a key facet of the social contract in this country.
Namely that with privilege and good fortune come responsibility. Our hosts, the Beacon Collaborative, put this in simple terms: “Our economy offers the freedom to create great wealth, but with reward must come responsibility.”
That responsibility is not about sacrifice or denial. It is based on an understanding that we are all part of a wider community, an ecosystem of mutual dependence and support, on whose cohesion the success of our society – and all individual wellbeing – ultimately rests.
A challenging sector landscape
The Charity Commission stands at a unique vantage point, where the perspectives of charities, government, the public and donors meet.
From this position, we see two trends.
First, an incredibly challenging economic environment for the sector.
Like other sectors, charities face inflationary pressures and rising operational costs.
But charities are also dealing with increased demands for their services.
And at the same time, public funding sources in particular are increasingly squeezed.
The cumulative impact of these trends on charities is, in some cases, extremely challenging.
Take arts and culture, a particular passion of mine. Between 2010 and 2023, grant in aid funding for UK arts and cultural organisations fell by 18%. Local government revenue funding of culture and related services has also decreased by 48% in England, and 40% in Wales.
It’s important to acknowledge that these cuts have come amid very challenging public finances, with tough choices having to be made. But the impact on the sector is undeniable.
Other sub-sectors are especially vulnerable, too.
Last summer, we learnt that one in five hospices in the UK have cut or closed their services in the last year or are planning to do so.
In October, Getting on Board, which for twenty years played a crucial role in encouraging new talent into trusteeship, announced it could no longer continue to operate.
The case for philanthropy
Our second observation, though open to some debate, is a perception that high-net worth philanthropy has declined in recent years.
To be clear, the UK remains, according to some but not all measures, among the most generous group of nations on the planet, funding a thriving and vibrant charitable sector.
In total, charities in England and Wales last year managed over £90 billion in annual income. The contribution of charity and voluntary organisations as a percentage of GDP is greater, according to some measures, than the entire agricultural sector of the UK.
But the proportion of those giving seems to be falling.
For some years, The Charities Aid Foundation – who fulfil such a valuable role in producing research about the sector, and of course in supporting occasions such as this – have published reports pointing to a declining number of donors.
CAF’s latest report finds that, while the overall value of giving is holding up in real terms – in 2023 people donated at least £13bn to charity – fewer people are giving.
Separately, there is evidence suggesting that the top one percent of asset owners and earners in our country give less than their counterparts in equivalent societies, such as New Zealand and Canada. Some have suggested that there is a £5 billion gap between giving in the UK and in those two countries.
Previous research has indicated an overall decline in the value of donations by the top one percent of earners, despite increases in their income. And the latest UK giving report, just mentioned, finds that some of the least affluent parts of the UK are among the most generous.
In summary, by a number of metrics, it seems likely that while charitable giving is just about holding up, high net worth philanthropy is proving less robust.
The potential of philanthropy
But this challenging context provides for a once-in-several generations opportunity.
For while there may be huge challenge, there is also huge potential, right now, for a new era of philanthropy to tackle our most intractable social challenges. We have the opportunity to resource and re-ignite the potential of our communities, through a renewed collaborative approach between our amazing charitable sector, corporate donors, philanthropists, communities and government.
The potential of philanthropy lies not just in the immediate financial boost it might offer the individual charities.
But in the agility and flexibility, the innovation and creativity it can encourage, inspire and unleash.
I think, as a nation, it is time to re-embrace the long and proud history of philanthropic impact, revive it, unleash it and celebrate it for our times.
I speak from personal experience as to the benefits philanthropy can bring.
I grew up in Liverpool in the 1980s. The city was then in post-industrial decline, and it felt in many ways forgotten and neglected by many. It had, arguably, lost its sense of purpose.
Today my home city is transformed. And that transformation happened through a combination of philanthropic investments, national and local government investment, alongside renewed community action notably in the arts, culture and tourism which acted as catalysts for wider renewal.
Financial and cultural investment in Liverpool in turn led to an expansion in higher education provision, an influx of international students and therefore an increasingly skilled workforce.
Liverpool is now in the process of a next phase of transformation. National non-governmental bodies have moved their HQs to the city, and life science industries are investing. Things are moving and changing thanks to that initial spark provided through philanthropy.
It shows that philanthropy and charity are ever evolving, finding new models and new ways to deliver real and lasting impact. That philanthropy and charity are not just about handouts, but hand-ups and start-ups, with the power to unleash people’s and communities’ potential.
To return to arts and culture, a sector that is now highly reliant on major gifts and sponsorships.
The Donmar, for example, lost its council funding in 2022. Now, any work that is not revenue generating must have its costs covered by fundraising. Corporate sponsorship has stepped in and is helping to ensure that the Donmar can continue to invest in its talent development programmes – providing paid traineeships to those underrepresented in the arts industry – and its community work in Camden and Westminster, offering free engagement programmes to over 5,000 young people every year.
Great charitable work, only possible now thanks to philanthropy.
Of course, philanthropy alone cannot make a city or a community, or reverse a social ill. But it can act as a spark that re-ignites hope and confidence and gives a community the confidence to revive itself, and to unleash its potential to adapt to changing economic, political and social circumstances.
The mechanisms for this particular role of philanthropy are varied.
First, philanthropists can do what other funders – notably public sector funders – cannot.
They can take risks and innovate, work out new solutions to deep-rooted problems by trying and testing.
They can support charities’ core costs, helping them develop long-term viability and stability, rather than living only from one grant to the next.
And philanthropists can sow seeds – offering large, one-off donations that allow new charities to get off the ground, or established charities to plan for the long term.
Celebrating philanthropy
So again, whilst there are challenges, there is much to recognise and celebrate.
For example, I am moved to see corporate philanthropy combine with public generosity, community campaigning, media engagement and political interest – as well as support from the Charity Commission – to breathe new life into Zoe’s Place in Liverpool.
The charity provides end of life hospice care to babies and young children, bringing children and their families comfort and relief in incredibly challenging circumstances. It had faced closure in Liverpool, due to the spiralling costs of new accommodation.
Together, campaigners raised £6m in a month before Christmas, allowing the charity to continue.
It was an amazing effort that would not have been possible without philanthropic contributions.
Similarly, I am deeply impressed with the work of the Moondance Foundation. Founded in 2010 by Diane and Henry Engelhardt, the charity has given away a remarkable £145 million, most of which has gone to support and strengthen communities in Wales, which is the family’s chosen, adoptive home. In December last year, we visited small community organisations in Port Talbot, Swansea, and Bridgend that have all benefited from this extraordinary generosity.
Their example shows that love of a place, responsibility and commitment to a community is a matter of heart, not necessarily heritage.
I would also like to mention here the work of the late Julia Rausing, who sadly passed away last year, leaving an immense legacy of generosity and kindness. She was an example to others, not just in how much she helped give away, but how – her sense of urgency and oversight ensured funds, where needed, were swiftly dispatched and carefully accounted for.
Or the musician Stormzy, who has given back of his wealth and influence to promote education and opportunity among young people.
And I must mention the Commission’s own board member Rory Brooks, who recently donated £2m to the Global Development Institute at The University of Manchester. He will not thank me for including his example here, but in his absence, Rory – if you want to promote philanthropy, you must let us celebrate your own example.
The Commission’s ongoing commitment to promoting philanthropy
I know many in the philanthropy world have been wondering what Orlando’s departure as Chair later this year means for our work in this area.
First, I would like to acknowledge the significant contribution Orlando has made to public discourse on philanthropy during his time in office.
Orlando has used his authority and his voice as Chair of the Charity Commission to ensure philanthropy is seen and understood as one of the solutions to the urgent issues of our day.
And he has made a compelling case for the responsibilities and opportunities the Commission has to convene public debate on this issue.
So I know many in the world of philanthropy and beyond are very sorry to see Orlando move on from the Commission.
But let me make very clear.
The work he began will continue.
I, and the Commission’s Board, are determined to deliver on the commitment made in our corporate strategy to encourage trusteeship and amplify donor and philanthropic confidence through our work.
I am bound by these commitments, not just by professional duty, but by personal conviction. A regulator must enable, encourage and unleash, as well as enforce.
I am grateful to Rory Brooks, as I’ve mentioned a remarkable philanthropist in his own right, who as a member of the Commission’s board is spearheading much of this work.
Rory’s diligent commitment over the past two years has borne much fruit.
I am convinced that his quiet powers of persuasion have contributed to a changing public discourse on philanthropy.
A renewed understanding, on all sides of the political divide, that private wealth, voluntarily given, is part of the solution to some of the most entrenched of our social ills.
The new government has demonstrated its interest in philanthropy, particularly in geographical areas that are struggling to attract funding. We heard earlier from Minister Peacock about the government’s commitment to producing a place-based philanthropy strategy, more details of which we expect to hear about over the coming months.
The Commission’s role and work
But for our own part, what are we collectively doing at the Commission to promote philanthropy?
Promoting the UK as a great place to give
First, we have a role in ensuring, and demonstrating, that the UK remains among the best and safest places to give.
We have a robust, long-established regulatory infrastructure, which ensures transparency – not least through the accounting framework – and which gives donors confidence that there is oversight over the funds that charities receive.
That infrastructure stretches beyond the work of the Commission alone – other principal regulators, such as the Department for Culture Media and Sport and the Office for Students, play an important role in regulating vital sub-sectors in the field of culture, arts and heritage, as do auditors and independent examiners working to regulatory requirements.
In that context, the UK is also a centre of excellence for professional services – we boast among the best lawyers, financial advisors and wealth managers in the world.
There is room for more active input from these professionals in promoting philanthropy.
In the legal world, especially, there is an opportunity for those advising on transactions involving significant assets to actively introduce and encourage philanthropic considerations.
But overall, the system we have in place means philanthropists from all over the world can have confidence in investing their goodwill and generosity into UK based charities – many of which, of course, operate globally.
Supporting charities to improve governance
Second, we help trustees understand their legal duties and sustain and improve their charities’ governance.
Last year, we published guidance supporting trustees to make the right choices on accepting, refusing and returning donations. That guidance reflected the law in being explicit about the starting point that charities should accept donations.
It is for trustees to make decisions as to what is in their charity’s best interests. Sometimes, trustees may well conclude that they should not accept a philanthropist’s support. But we wanted our guidance to be clear that the law assumes donations to charities to be generally a good thing.
We wanted to support trustees to say yes to donations where, having carefully weighed up the relevant factors, it is in their charity’s best interests – even where it might be contentious or controversial for some.
And I think that reminder is salutary at the present time, given the challenging financial context I set out earlier.
The last thing I want to see on my watch at the Commission is charities – including world leading arts and cultural organisations which have long benefited from philanthropic generosity – finding they can no longer operate successfully, because donations are withheld for fear of being rejected.
So I encourage those giving – whether individual philanthropists or corporate donors – to continue to do so even when there may be those who disagree with such donations from a point of personal principle or conviction. It is the benefit of democracy that we can disagree while still each exercising our individual freedoms and still do good for charity, our communities and those most in need.
To help enable this, we hope our guidance will inform a giving culture, but also a receiving culture, that allows for constructive discussion in the best long term interests of charity.
Delivering data-led insights
Third, the Commission maintains, to our knowledge, the most complete and comprehensive charity data set anywhere in the world. Although this presents its own challenges, we’re also keen to recognise the opportunities for collaboration with partner organisations.
Over the last 18 months, Rory has led two summits focusing on the Commission’s data, our ongoing digital projects, and how we plan to help the sector make more informed funding decisions.
I know, for instance, the impact that digitisation of charity accounts will have for those working with charity data and that is why it remains such a priority for us.
These summits give us fascinating insights into how the philanthropy sector uses, and would like to use, charity data. In the near future we will see an early outcome of this work, with new data drawn from charities’ annual returns on the value of their single largest donation received during that year.
This data will, over time, not just provide useful insights into trends in philanthropy, but will, I hope, serve as inspiration to existing and potential philanthropists to give with heart and confidence.
Convening role, working with government
A final aspect of the Commission’s role that I am especially keen to promote is that of convenor.
We have a unique ability to help bring together the sector, government, philanthropists and donors as well as experts such as our hosts Beacon and the Charities Aid Foundation to consider, together, how we can encourage those with great wealth to choose the UK as a place to leave a legacy.
It has begun with the work I mentioned on data, but we want to go further and identify other focus areas, bringing together those with the passion and capability to drive progress. Specifically, we are keen to continue to work alongside other players to support government and other policy makers to ensure giving is incentivised and celebrated.
Conclusion
So in conclusion, despite the challenges, I believe we have a generational opportunity to revive and reignite our proud history of philanthropic giving for a modern age.
To build on the many recent examples of joined up action, be it place-based or issue-based, which see philanthropy, community, business, media and politicians come together to unleash potential, solve issues or spark renewal.
It is the power of that collective action, that joined-up approach to today’s challenges, that this generation of philanthropists and charities can use to continue to achieve the seemingly impossible, to improve the lives of many and unleash the spark of hope, innovation and opportunity.
As the CEO of the Commission I promise you we will be there beside you, playing our part, enabling you to do the amazing things you do for the benefit of society.
We at the Commission will also help ensure that this growing band of philanthropists feel proud of their achievements, and use our platform to shout about them – encouraging others to follow suit. So to all of you who give, to those professionals that advise and support giving – thank you – never under-estimate the impact you have – and the opportunity you enable.
ATLANTA – Governor Brian P. Kemp today announced his appointment of Josh Lamb as director of the Georgia Emergency Management and Homeland Security Agency (GEMA/HS). Lamb will fill the role following the departure of previous director Chris Stallings.
“I’m honored to welcome Lt. Col. Lamb to GEMA and thank him for stepping into this important leadership role that is critical to the safety and recovery of Georgia’s communities, especially as we continue to rebuild from Hurricane Helene and other storms,” said Governor Brian Kemp. “I know Lt. Col. Lamb is committed to that mission and will provide the leadership necessary to ensure our state is prepared to respond to disaster and proactively keep Georgians safe. Marty, the girls, and I also want to thank Mike Smith for his service during this recent transitional period and for his continued leadership as GEMA Chief of Staff.”
Lieutenant Colonel Josh Lamb serves as the Department of Public Safety’s Assistant Commissioner, overseeing several key areas, including the Office of Professional Standards, the Human Resources Division, the Public Information Office, the Office of Public Safety Support, and Legislative Affairs. He was appointed to his role as Assistant Commissioner on October 1, 2023, having previously served as the Director of Administrative Services.
Lt. Col. Lamb began his law enforcement career in 1996 as a special agent with the Tri-Circuit Drug Task Force after graduating from Georgia Southern University with a bachelor’s degree in justice studies. In 1998, he joined the Georgia State Patrol and graduated from the 74th Trooper School. He has held various positions throughout his career, including corporal at Post 11 Hinesville, sergeant at Post 45 Statesboro, sergeant first class at Post 45 Statesboro, Post 16 Helena, and Post 18 Reidsville. He also dedicated eight years as a State of Georgia SWAT team member. In addition, he served as a lieutenant in the Planning and Research Unit, where he developed departmental policies, organized special events such as the 2018 National College Championship Game and Super Bowl LIII, and worked on legislative matters, including the distracted driving law. His roles have included director of training, SWAT team commander, executive officer to the deputy commissioner, chief of staff, and director of administrative services.
Lt. Col. Lamb earned a master’s degree in public administration from Columbus State University and attended the 259th Session of the FBI National Academy, where he was one of only two individuals from Georgia ever chosen to represent his session as class spokesperson. He also served as an FBI executive fellow and has taught nationally. He graduated from the Georgia Association of Chiefs of Police Chief Executive Training Course. He recently served as the head of delegation for the 31st Georgia Law Enforcement Delegation to Israel.
Lt. Col. Lamb and his wife, Alison, have two daughters, Kenley and Karson.