
Showing posts from October, 2025

Draft bill to overhaul Indian Statistical Institute unveiled

The Union government has unveiled a new draft bill that aims to bring sweeping changes to the iconic Indian Statistical Institute, a 94-year-old centre that helped design some of India’s earliest data-driven economic policies during the planning era, including the national sample surveys. The Union ministry of statistics and programme implementation uploaded the draft “The Indian Statistical Institute Bill, 2025” on Friday, seeking public consultation on a set of structural changes and a new executive authority. The ISI was founded in 1931 by PC Mahalanobis, a pioneering statistician and member of the erstwhile Planning Commission. His “Mahalanobis model”, a statistical framework, underpinned India’s early industrialisation strategy, which stressed heavy state-led capital investment to substitute for imports. The ministry’s pro forma for public comments said the draft bill aimed at fostering “excellence” and “establishing clear institutional structures, streamlining deci...

Generative AI models mostly inaccurate when sourcing statistical data, finds trial

UK – Most current large language models failed to accurately answer a question focused on UK statistics, in an experiment published by the MRS Census and GeoDemographics Group (CGG). According to the report, most of the AI models got the answer wrong or refused to answer the question. Only one system returned the correct answers (according to the ONS) the first time, while another got it right on the second attempt at prompting (with no changes to the prompt). The report found that while the outputs looked coherent in vocabulary and grammar, the quality of the numbers provided was poor. Additionally, running the same question again was likely to produce a different answer. Report authors Jaan Nellis and Peter Furness conducted the trials to examine how AI tools perform on a specific query about specific data, following a discussion at a CGG meeting earlier this year. Speaking to Research Live, Furness explained: “It’s no longer the preserve of experts t...

Global Statistical Process Control Software Market by Type and Application - Strong 8.5% CAGR Forecast from 2026 to 2033

New Jersey, United States: "The global Statistical Process Control Software market in the Information Technology and Telecom category is projected to reach USD 2.5 billion by 2031, growing at a CAGR of 8.5% from 2025 to 2031. With rising industrial adoption and continuous innovation in Information Technology and Telecom applications, the market is estimated to hit USD 1.2 billion in 2024, highlighting strong growth potential throughout the forecast period." Statistical Process Control Software Market Size & Forecast 2031 The Statistical Process Control (SPC) Software market is witnessing strong growth as industries embrace data-driven quality management and automation. This software enables real-time monitoring of manufacturing processes, helping organizations reduce variability and enhance product consistency. The increasing adoption of Industry 4.0 technologies, IoT-enabled sensors, and predictive analytics is fueling demand for SPC solutions. Manufacturers are leveraging...
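Growth claims like these are easy to sanity-check with the standard compound-annual-growth-rate formula, CAGR = (end/start)^(1/years) − 1. A minimal Python sketch, using the press release's own 2024 and 2031 figures as inputs (note the release's endpoints imply a somewhat higher rate than the quoted 8.5%, which is exactly why such checks are worth running):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# Press-release figures: USD 1.2 billion (2024) to USD 2.5 billion (2031)
implied = cagr(1.2, 2.5, 2031 - 2024)
print(f"implied CAGR: {implied:.1%}")  # implied CAGR: 11.1%
```

The round trip also holds: compounding the start value by the implied rate for seven years recovers the end value.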

European Statistics Code of Practice: ensuring excellence

Principle 7: sound methodology
Sound methodology is fundamental to the production of credible and reliable statistics. Principle 7 requires that statistical processes are based on internationally recognised scientific standards, best practices, and robust methods. By adhering to sound methodology, statistical authorities ensure that results are accurate, comparable, and consistent over time and across countries, providing a solid foundation for evidence-based policymaking and public debate.

Principle 8: appropriate statistical procedures
Even the best data and methods can fail if not implemented with proper procedures. Principle 8 emphasises that statistical authorities must apply appropriate procedures throughout the entire production process - from data collection and processing to validation and dissemination. Good procedures ensure efficiency, minimise errors, and maintain the integrity and quality of statistical outputs at every step.

Principle 9: non-excessive burden on res...

While the near-term forecast is revised up modestly, global growth remains subdued, as the newly introduced policies slowly come into focus

  The global economy is adjusting to a landscape reshaped by new policy measures. Some extremes of higher tariffs were tempered, thanks to subsequent deals and resets. But the overall environment remains volatile, and temporary factors that supported activity in the first half of 2025—such as front-loading—are fading. As a result, global growth projections in the latest World Economic Outlook (WEO) are revised upward relative to the April 2025 WEO but continue to mark a downward revision relative to the pre-policy-shift forecasts. Global growth is projected to slow from 3.3 percent in 2024 to 3.2 percent in 2025 and 3.1 percent in 2026, with advanced economies growing around 1.5 percent and emerging market and developing economies just above 4 percent. Inflation is projected to continue to decline globally, though with variation across countries: above target in the United States—with risks tilted to the upside—and subdued elsewhere. Risks are tilted to the downside. Prolonged unce...

17 Fake News Statistics For 2025 (Global Insights)

In 2025, fake news continues to pollute the internet at an alarming scale, with 62% of online content now deemed false. A staggering 86% of global citizens have been exposed to misinformation, while 40% of content shared on social media is fake. In the U.S., 80% of adults have consumed fake news, and 23% admit to sharing false stories, knowingly or not. Trust is eroding: only 32% of Americans trust the news media, and fake news costs the global economy $78 billion annually. In this report, I uncover key statistics on fake news from around the world, highlighting its spread, sources, and impact. Let us get into it!

Fake News Statistics 2025: Top Picks
- Approximately 62% of online information could be false.
- Over 23% of surveyed Americans admitted to sharing a fake news story.
- 9 out of 10 American adults fact-check their news.
- Almost 45% of UK adults report coming across a fake news item daily.
- On average, 40% of content shared on social media platforms...

Is Economics STEM? Understanding the Classification, Scope, and Relevance in Modern Education

What Is STEM?
Before defining where economics fits, it’s essential to understand what STEM truly represents. STEM stands for Science, Technology, Engineering, and Mathematics, encompassing disciplines that rely on empirical reasoning, quantitative analysis, and experimental validation. These fields are the foundation of innovation and economic growth, emphasizing technical skills, data-driven thinking, and problem-solving. STEM subjects are characterized by measurable outcomes, the use of mathematical and scientific methods, and an emphasis on evidence-based conclusions. Traditionally, these include physics, biology, chemistry, computer science, mathematics, and engineering—but the modern educational and professional landscape has expanded this list to include fields that apply quantitative tools, like data analytics, econometrics, and even certain branches of psychology or geography.

Economics: A Social Science with Quantitative Roots
Economics has long been class...

Efficiency, Sustainability Drive Thermo Fisher’s New Microarray Data Analysis Solution

  There are two key things to know about the new Applied Biosystems™ SwiftArrayStudio™ Microarray Analyzer from Thermo Fisher Scientific, said Ravi Gupta, vice president and general manager of Thermo Fisher’s microarray business, in an interview with GEN. It is designed to be faster, more efficient, and cost-effective than existing tools, while addressing current research and anticipating future applications. Robert Balog, PhD, senior director of research and development at Thermo, echoed those points. Although this first version of the analyzer focuses on genotyping and copy number variation, there are developments in the roadmap that will “lead us to other biomarker types and other variant types,” he told GEN.  The conversation took place during this year’s meeting of the American Society of Human Genetics in Boston, MA, where Thermo officially launched the device. The company says that the system integrates four key genotyping processes, allowing re...

Statistical inference and simulations in stochastic modelling

Simulation methods within statistical data analysis provide invaluable tools to advance knowledge in many areas, such as biomedicine. Umberto Picchini has recently been appointed Full Professor in Mathematical Statistics and will hold an inaugural lecture about his very personal journey through statistical inference and stochastic modelling. Umberto’s main research interest is constructing statistical methods to quantify uncertainty in stochastic models, especially mathematical models that aim to describe natural – physical or biological – processes affected by randomness. Most of his current research concerns Bayesian inference methodology. Examples of applied work concern the growth of tumours on the skin of mice, single-cell dynamics in systems biology, dynamics in the concentration of glucose and insulin in blood plasma, and neuropathy problems, where neurons in the skin die due to diabetes. As he puts it: “Deliberately or not, all my applications so far have been biomedical. It really ...

7 newer data science tools you should be using with Python

Python’s rich ecosystem of data science tools is a big draw for users. The only downside of such a broad and deep collection is that sometimes the best tools can get overlooked. Here’s a rundown of some of the best newer or lesser-known data science projects available for Python. Some, like Polars, are getting more attention but still deserve wider notice. Others, like ConnectorX, are hidden gems. ConnectorX Most data sits in a database somewhere, but computation typically happens outside of it. Getting data to and from the database for actual work can be a bottleneck. ConnectorX loads data from databases into many common data-wrangling tools in Python, and it keeps things fast by minimizing the work required. Most of the data loading can be done in just a couple of lines of Python code and an SQL query. Like Polars (which I’ll discuss shortly), ConnectorX uses a Rust library at its core. This allows for optimizations like being able to load from...
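The "couple of lines" pattern described above is a connection string plus a SQL query, i.e. ConnectorX's `cx.read_sql(conn_uri, query)`. As a dependency-free stand-in, here is the same round trip sketched with the standard library's sqlite3 module; the table and data are hypothetical, made up for illustration:

```python
import sqlite3

# Hypothetical example data: a tiny in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("south", 95.5), ("north", 42.0)])

# ConnectorX follows the same shape in a single call, e.g.
#   cx.read_sql("postgres://user@host/db", query)
# and is much faster for large result sets.
query = ("SELECT region, SUM(amount) AS total FROM sales "
         "GROUP BY region ORDER BY region")
rows = conn.execute(query).fetchall()
print(rows)  # [('north', 162.0), ('south', 95.5)]
```

With ConnectorX the result lands directly in a pandas or Polars frame rather than a list of tuples, which is where the real convenience lies.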

Traumatic Brain Injury Market: Epidemiology, Therapies, Companies, DelveInsight | veriNOS pharmaceuticals GmbH, Cellvation, Abalonex, Hope Biosciences

Some of the key facts of the Traumatic Brain Injury Market Report: The global traumatic brain injury (TBI) market is projected to experience steady growth from 2025 to 2034, largely fueled by the development of innovative therapies. According to DelveInsight, there were approximately 4.3 million new TBI cases across the 7MM (United States, EU5, and Japan) in 2024, with numbers expected to rise throughout the forecast period. The United States reported the highest incidence, followed by Japan, Germany, France, and the United Kingdom. In terms of market value, the U.S. led in 2024, with standard care generating around USD 1.12 billion. Current treatment approaches primarily involve off-label use of medications, including antidepressants, antiepileptics, antipsychotics, analgesics, antacids, and other symptom-relief drugs. Notably, 45–85% of TBI patients rely on psychotropic and pain medications, undersc...

Survival Analysis When No One Dies: A Value-Based Approach

A generalized version of Kaplan-Meier makes it possible to model a continuous value (like money) instead of a binary signal (like survival). Survival analysis is a statistical approach used to answer the question: “How long will something last?” That “something” could range from a patient’s lifespan to the durability of a machine component or the duration of a user’s subscription. One of the most widely used tools in this area is the Kaplan-Meier estimator. Born in the world of biology, Kaplan-Meier made its debut tracking life and death. But like any true celebrity algorithm, it didn’t stay in its lane. These days, it’s showing up in business dashboards, marketing teams, and churn analyses everywhere. But here’s the catch: business isn’t biology. It’s messy, unpredictable, and full of plot twists. This is why there are a couple of issues that make our lives more difficult when we try to use survival analysis in the business world. First of all, we are typically not just inter...
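As a refresher on the estimator the post generalises, here is a self-contained sketch of the classic Kaplan-Meier product-limit arithmetic in plain Python. The data are made up: durations could be months of a subscription, with `1` marking an observed event (churn) and `0` marking censoring (still active):

```python
def kaplan_meier(durations, events):
    """Kaplan-Meier survival curve.

    durations: time until event or censoring, per subject.
    events: 1 if the event was observed, 0 if censored.
    Returns [(t, S(t))] at each time where at least one event occurred.
    """
    pairs = sorted(zip(durations, events))
    at_risk = len(pairs)
    survival = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = exits = 0
        # Group all subjects leaving the risk set at time t.
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            exits += 1
            i += 1
        if deaths:  # censoring-only times do not change S(t)
            survival *= 1 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= exits
    return curve

# Hypothetical data: events at t=1, 2, 4; censoring at t=3 and t=5.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
print([(t, round(s, 3)) for t, s in curve])  # [(1, 0.8), (2, 0.6), (4, 0.3)]
```

In production one would normally reach for lifelines' `KaplanMeierFitter`; the sketch above only exposes the product-limit step that the value-based generalisation replaces with a continuous quantity.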

Why AI Still Can’t Replace Analysts: A Predictive Maintenance Example

New AI models like GPT-4, Claude 3, and Gemini can process and summarize large volumes of unstructured data, generate forecasts, and draw analytical conclusions. Generative AI is modeling proteins, optimizing logistics, and predicting consumer behavior. According to McKinsey, its economic potential could reach up to $4.4 trillion annually. Despite its impressive achievements, AI remains significantly limited in certain areas of analytics. It still cannot make long-term economic forecasts and struggles to predict sudden market shifts. Industrial equipment data analytics is one of the fields where AI still falls short. I have worked in industrial analytics for over 10 years and have watched the sector transform as new technologies were introduced. Today, artificial intelligence can detect even the slightest signs of malfunction....