Monday, 22 July 2024

When Science becomes Pseudo-Science

 

There is a lot of discussion about science nowadays, but is what the media dubs “scientific” immune to ideology and subjective interests? And to what extent are hegemonic scientific narratives pseudo-scientific? 


Urmie Ray
(published 19th November 2022)

What is science?

There is a lot of confusion about what science is and what it is not. So let us start by discussing the actual definition.

Understanding the reality we live in

To survive, we have to understand the physical reality in which we live. So from the beginning, our species, worldwide, has been trying to approach this challenge. To understand means being able to describe, to explain and to predict. A description tries to answer the question of “how?” and this may take the form of a mathematical formulation, but this is far from necessary. An explanation tends to answer the much more difficult question of “why?” These three aspects are closely related. The quality of predictions depends on the quality of descriptions and explanations. And, conversely, the better the description, the better the explanations and predictions.

The difference between a scientific approach and a religious approach essentially lies in the nature of the explanations. It is likely that attributing a phenomenon to our ancestors, or to one of their Gods, did not allow us much foresight, nor thereby any protection against the elements. At the same time, the increasing domestication of fire went hand in hand with a growing understanding of this phenomenon.

Thus, an approach to the world that we perceive through our senses and our minds has long been distinguished from other approaches. The knowledge obtained had to be communicable to others, so that they too could verify it, and in particular, be able to predict as systematically as possible.

I understand that some people are convinced of the power of prayer, but spiritual knowledge, even if it is based on experiences, remains personal and not communicable.

Thus, the approach that has gradually developed into what has been called science since the 19th century is based on two tools: observation and reasoning.

But as our earliest schools of natural philosophy realized long before the Christian era, the question of reliability is central to this approach. To what extent does our interpretation correspond to the given physical reality?

Reproducible observation

Limited observation may not be reliable. For example, just because all observed swans are white, does not mean they are all white. There are black swans. Therefore, observation must be reproducible at will, especially for it to be verifiable.

Consequently, the field of scientific investigation is limited to describable characteristics, that is, those that can be compared. Absolute concepts, such as heat or height, are certainly part of our life experience. But they have no place in science. For there is no way to check how two different individuals perceive them. What can be communicated and agreed upon is that one object is hotter or colder, larger or smaller than another. In other words, only those attributes that can be compared to an external reference can be subject to scientific examination. Therefore, absolute concepts like God are off limits because they cannot be described by comparison.

This is also the problem with Newtonian physics. It is based on a notion of time and space that is absolute. Newton was aware of this. But it was the physicist Ernst Mach who first really understood the meaning of this deficiency at the end of the 19th century. It was based on his work that Albert Einstein developed his theory of relativity.

Let us add an important observation: it is not enough to know the intrinsic properties of a phenomenon. It is also necessary to have an idea of its interactions with the environment. Hence the need to study it both in isolation in a laboratory and in its natural environment, in time and space. Some effects and their significance may not be immediately visible.

Reasoning


On the other hand, if we remain tied to our limited immediate experience, we will have to collect facts all the time and will not be able to predict. Therefore, science aims to obtain unified descriptions and explanations of disparate phenomena.

Reasoning is what prevents us from being tied to this process of constant trial and error. It must also be communicable. Therefore, it must be based on consensual methods that have varied over time or even from school to school.

Concepts

Finding explanations that unify various phenomena forces us to base our theories on concepts – forces, atoms, genes, etc. – which are not necessarily the same ones we are used to. We have no way of determining whether these are characteristics of our reality or constructions of the human mind to account for experiences common to all of us. They are scientifically indefinable except through the use of other concepts. However, and this is fundamental, the relevance of concepts must be confirmed by effects that can be perceived and described. They should be replaced by more appropriate concepts if observation and reasoning so require.

In summary

Science is the reasoned study, based on reproducible and sufficiently replicated observation, of the properties of the perceptible world that can be described by comparison, and of the perceptible effects attributed to the evolving and changing concepts thus deduced – a study that includes interactions with the natural environment.

We all have a scientific mind. The difference between a scientific approach and a conventional approach is one of degree, not type. It brings precision and systematization where they did not exist. For example, we all have an idea of the difference between fruits and vegetables, based on vague, sometimes cultural, reasons and comparisons. The scientific definition of a fruit is an organ that contains seeds, protects them during their development and helps them to disperse. Therefore, tomatoes and cucumbers are fruits, contrary to popular belief that they are vegetables.

In other words, science clarifies the similarities and differences between comparable objects, taking observation beyond the superficial. In this way, ambiguities are reduced.

It is essential to emphasize that our conclusions cannot be fanciful and inexplicable deductions. They cannot be considered scientific until they are supported by reasoned arguments. Satisfactory results may be pure coincidence. On the other hand, reasoning that is not based on solid empirical evidence is not science.

Mathematics

This is why mathematics is not a science. Science arose from the human ability to make sense of an otherwise chaotic reality by attributing patterns to it. Mathematics was born from the study of patterns that are quantifiable.

In order to master the interrelated notions of quantity and space, our ancestors developed the concepts of integers and geometric objects, excluding the particular nature of the objects in question and maintaining only their quantity or shape. They must have realized that if our only interest is the quantity and not the other qualities of the objects in question, then there is no difference between two fingers and two equal lengths, but that these differ from five fingers. In other words, numbers arose from our recognition of patterns relating to quantity, making our most basic reasoning tool, comparison, as effective as possible.

Without going into more detailed discussion, I will say that mathematics is the logical study of relationships between abstract concepts, based on the notion of numbers.

Weaknesses intrinsic to science

These characteristics of science make our scientific knowledge very precarious.

Simplification and approximation

Even the most holistic approach is a simplification. The human mind is incapable of encompassing the entirety of an unfathomably complex nature. All our deductions, all our observations, all our measurements are only an approximation of reality.

These problems are exacerbated in mathematical theories. A hypothesis must first be expressed in a common language. The process of translation into mathematical symbolism is accompanied by a great loss of information. It eliminates everything that is not quantifiable from the start. Therefore, the further we get from the inanimate world, the less appropriate a mathematical description becomes. Even among quantitative characteristics, a choice must be made. Mathematics can only deal with a very limited number of parameters, and only a very simplified version of their relationships. Thus, a mathematical model reflects reality only very imperfectly.

The approximation process goes even further. Although equations have exact solutions in theory, in all but the simplest cases we can only solve them approximately. This is generally the case with differential equations, that is, equations that describe the evolution of a system in time and space and are therefore the basis of predictions. A whole series of approximations occurs again when translating our mathematical theory back into everyday language, i.e. when applying it to concrete reality, especially since it is likely to involve numbers that cannot be written down exactly, such as √2 or π.
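As a small illustration of this point, here is a minimal Python sketch (the equation and the step counts are arbitrary choices of mine, not taken from the article): even for dx/dt = x, whose exact solution is known, a step-by-step numerical method only approaches the answer, and the residual error depends on the step size.

```python
import math

def euler(x0, t_end, steps):
    """Forward Euler integration of dx/dt = x starting from x0."""
    dt = t_end / steps
    x = x0
    for _ in range(steps):
        x += dt * x          # one approximate step
    return x

exact = math.exp(1.0)        # exact value of x(1) when x0 = 1
for steps in (10, 100, 1000):
    approx = euler(1.0, 1.0, steps)
    print(f"{steps:5d} steps: x(1) ≈ {approx:.6f}, error = {exact - approx:.6f}")
```

Refining the step size shrinks the error but never eliminates it entirely.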

Furthermore, the mathematical part can, as in quantum physics, have more than one scientific interpretation.

In conclusion, the perfect precision inherent to mathematical formalism allows us greater control over certain quantifiable characteristics, but precisely because of this precision, it is very far from reality.

To quote Einstein, “To the extent that the propositions of mathematics refer to reality, they are uncertain; and to the extent that they are precise, they do not refer to reality.”

Unpredictability

It is not surprising that unpredictability follows even in the simplest deterministic theory:

Consider the following example constructed by the physicist Max Born. A particle moves without friction along a straight line of length l between two walls. When it reaches the end of the line, it ricochets. Suppose that its initial position is given by the point x0 on the line, that its initial velocity is v0, and that the inaccuracies of our initial measurements are Δx0 and Δv0. According to Newton's first law, at an instant t, it must be at the point x = x0 + t·v0. However, according to the same law, our prediction of its position at time t will deviate from this value by Δx = Δx0 + t·Δv0. So our error will continue to increase over time. After a critical time tc = l/Δv0, this deviation will be greater than the length l of the line. In other words, for any time t > tc, we will not be able to predict the position of the particle at all. It can be anywhere on the line.
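Here is a minimal numerical sketch of Born's argument in Python; the length of the line and the measurement uncertainties are arbitrary illustrative values, not figures from Born.

```python
# All numbers are arbitrary illustrative choices.
l = 1.0        # length of the line (metres)
dx0 = 1e-6     # uncertainty in the initial position (metres)
dv0 = 1e-6     # uncertainty in the initial velocity (metres per second)

t_c = l / dv0  # critical time beyond which prediction becomes meaningless
print(f"critical time t_c = {t_c:.0f} seconds")

for t in (1e3, 1e5, 1e6, 2e6):
    dx = dx0 + t * dv0               # uncertainty in the predicted position
    status = "prediction lost" if dx > l else "still predictive"
    print(f"t = {t:>9.0f} s: uncertainty = {dx:.6f} m ({status})")
```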

We can improve our measuring instruments and reduce initial inaccuracy, but we can never completely get rid of it. All we will do is extend the time range in which prediction is possible.

This example concerns a simple and ideal closed system. In the real world, countless factors are involved, worsening unpredictability. Basically, due to inevitable errors, our ability to know what is happening beyond a certain time may be limited to such an extent that no amount of technical progress can overcome it.

In our computerized calculations, small errors can propagate and grow. This is because the encoded way in which a computer carries out its internal calculations involves rounding errors. Errors also occur when the encoded result is translated back into the form printed on the screen.
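A small Python sketch of this effect (purely illustrative): the number 0.1 has no exact binary representation, so adding it repeatedly accumulates a rounding error that grows with the number of operations.

```python
from fractions import Fraction

total_float = 0.0            # ordinary floating-point accumulation
total_exact = Fraction(0)    # exact rational arithmetic for comparison
for _ in range(1_000_000):
    total_float += 0.1
    total_exact += Fraction(1, 10)

print("floating point result:", total_float)          # slightly off 100000
print("exact result         :", float(total_exact))   # 100000.0
print("accumulated error     :", total_float - float(total_exact))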

Observation in the computer age

Computerization also adds new questions to the act of observation. It has been known since before the Christian era that observation, as a result of a complex collaboration between our senses and our mind, is far from neutral and can be misleading.

Since then, observational instruments have introduced a whole series of new complications, despite the unsuspected possibilities that have opened up. In addition to the introduction of errors, studying events in our four-dimensional space-time from one- or two-dimensional symbolic representations raises the issue of information loss. Most importantly, computers are composed of algorithmic processes represented by 0's and 1's, and are therefore severely limited by oversimplified assumptions. They cannot go beyond that, they cannot infer. So we wonder if they can only detect what fits our preconceptions.

In fact, the problem worsens as the observation process becomes increasingly automated, thus eliminating the human observer: the machine observes and interprets. It is even worse when observation is removed altogether and conclusions are based on simulations rather than real experiments, as is increasingly the case. These problems raise many questions about our knowledge of the microscopic world. It depends entirely on instruments. We have virtually no unfiltered representation against which to compare the image they give us. Furthermore, in order to observe it, samples are often not only removed from their environment, but also have to be prepared, for example using a staining technique. An adulteration then occurs.

Generalization

All of this calls into question the process of generalization, that is, deducing principles from data that can only be limited. The problem of generalization is even more serious because the observation can be replicated, but it will never be the same. So how similar must the results be to be accepted as justification for a given conclusion? The question arises all the more because we are not simply trying to deduce the color of swans from repeated observations, but to deduce basic principles from observations of a wide variety of different cases. Too little data can lead to wrong models and therefore wrong predictions.

The greater the number of parameters, the greater the sensitivity of the results to the initial conditions, and the less we can expect the results of our experiments to agree closely. Furthermore, results may depend on the interpretation and protocol applied. Obtaining consistent results can therefore be difficult. So how many times must an experiment be replicated before its results can be accepted?
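As an aside, the conventional statistical framing of this question can be sketched in a few lines of Python (the spread and the target precision below are arbitrary assumptions of mine): it treats replication as the question of how fast the uncertainty of an averaged result shrinks with the number of independent repetitions. The choice of an "acceptable" uncertainty, however, remains a convention.

```python
import math

sigma = 1.0      # assumed spread (standard deviation) of a single measurement
target = 0.1     # desired uncertainty of the averaged result (arbitrary choice)

for n in (4, 16, 64, 100):
    sem = sigma / math.sqrt(n)       # standard error of the mean of n replicates
    print(f"n = {n:3d} replicates: standard error of the mean = {sem:.3f}")

n_needed = math.ceil((sigma / target) ** 2)
print(f"replicates needed to reach an uncertainty of {target}: about {n_needed}")
```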

Basically, the question of when experimental verification can be considered satisfactory has no clear answer. It cannot necessarily be said that it should depend on the success of its applications, as its drawbacks may take some time to be noticed. Even when a hypothesis is developed in the best scientific spirit, serious flaws can remain unidentified for decades, precisely because our observation remains limited, if only for technical reasons.

When is it reasonable to apply a hypothesis, that is, to construct new hypotheses based on it or to use it technologically?

Hypotheses

There can be no science without hypotheses. We must first have established a relationship with the universe before we can even think scientifically. In other words, metaphysics always precedes science. More generally, science remains based on assumptions that are forgotten because they are hidden and have become too familiar. These can strongly influence the theories we develop.

For example, mathematical predictions imply integration. Behind this concept is the assumption of uniformity, according to which processes would remain the same across time and space. This assumption is the basis for all generalizations. For the Buddha, uniformity was assumed to be very limited. It was Democritus who introduced its most extreme version as a basic scientific principle. Galileo remained cautious. It was reaffirmed first by physicists in the 17th century and then by geologists, for whom the rates of geological processes remained the same over time.

However, due to unpredictability, we have no idea how well the uniformity holds. It is, therefore, better to be cautious with distant phenomena.

Furthermore, uniformity over time has been challenged by geological discoveries since the 1960s that suggest that unique cataclysms have critically altered existing conditions in our planet's history.

The limits of science

For all these reasons, although it is the least fanciful form of knowledge, we cannot know whether science can lead us to truths. Our scientific understanding is constantly being deepened. Therefore, it keeps us away from untruths. In fact, science cannot consciously tell us untruths. At all times it must comply with all known data. We improve our approximations, of course. But, in the infinity of the world, does this bring us closer to any truth?

Doubt is therefore characteristic of a scientific approach. Science challenges conventional wisdom. The importance of doubt has been emphasized by scientific thinkers of all times and traditions. Theories should not be rejected, but their acceptance should not be passive.

To proceed scientifically is to recognize that science is a “philosophy of nature”, even if it is different from other philosophies in that it “questions nature itself to obtain answers to what nature is”. To proceed scientifically is to hope that our scientific thoughts are in harmony with nature, as otherwise we would be incompatible with the given conditions of life, but it is also to recognize that science is far from being objective. It always presupposes the existence of man, and we must realize that we are not mere observers, but also actors on the stage of life.

From science to dogma

Staying on the scientific path requires caution. It is easy to stray from it.

However, sincere errors should not be confused with dogmatism. It is through error that we advance, all the more so because in each era, in each culture, science is influenced by existing thoughts and observation techniques. Thus, it is an anachronistic interpretation of Newtonian physics to apply our current understanding to it and consider it false. It's still satisfactory enough for some common phenomena, as long as the speeds involved are well below the speed of light.

That said, the nature of science has been highly appreciated for millennia, for example by certain schools of thought in ancient India. It was also the subject of heated discussions at the turn of the 20th century, when issues of positivism became clear. Thus, the deformation of modern science into dogma is facilitated by its intrinsic weaknesses, but to understand it, it must be placed in the economic context.

In the 19th century, market capitalism was transformed into financial capitalism. The profit-maximizing perspective that was gradually established required ceaseless material growth and thus increasingly efficient production.

As a result, technology must rely on advanced research to increase efficiency, resulting in constant and increasing changes to our environment, which is becoming less and less suitable for human life. The fact that something appears temporarily viable does not guarantee its compatibility with maintaining the conditions necessary for human life in the medium and long term, or even in the short term: health and environmental problems quickly followed and have continued to increase. A stage was then reached where, in order to stay the course, research increasingly lost its scientific nature and began to betray science itself.

In other words, science has turned into pseudoscience: a set of principles that claims to be science but does not have the characteristics of science – in particular, it is not based on reasoning grounded in reproduced and reproducible observation. It is, therefore, a belief.

Current research may too often be described as such. The extent of the abuses is difficult to measure because a basic condition – transparency – without which there can be no science, since conclusions cannot otherwise be checked, is now commonly ignored under the pretext of competition or state secrecy.

According to Richard Horton, editor of the prestigious Lancet, “much of the scientific literature, perhaps half, may simply be untrue. Afflicted by studies with small sample sizes, tiny effects, invalid exploratory analyses, and flagrant conflicts of interest, together with an obsession for pursuing fashionable trends of dubious importance, science has taken a turn towards darkness.”

Unfounded conclusions

Let's look at two examples.

1) Based on purely mathematical and theoretical assumptions, it was extrapolated from experiments with electromagnetic radiation (EMR) “in the visible, ultraviolet and X-ray bands”, that is, with “frequencies above the lower limit of the infrared”, that all EMR is quantized, that is, that it consists of photons. It was only in 2015 that this claim was experimentally tested and found to be wrong for EMR below the lower infrared limit, which includes all EMR from our antennas – one reason why this radiation is harmful to human health.

2) The viral thesis is also a hypothesis. Unlike the case of EMR quantization from antennas, it has not been proven to be false. But no particle has ever been observed to first be in the air, then enter the body, and become the source of a disease. Therefore, the virus remains a concept. Is this a useful hypothesis? Perhaps unicorns or ghosts are useful hypotheses to explain certain phenomena. But a scientific conclusion must be based on reproducible observation, and this is certainly not the case here. And so we can develop the concept of an antivirus, like a vaccine. But you cannot materially manufacture a vaccine against something that has not been proven to exist. And you cannot play politics on assumptions that remain unverified to this day. The debate over the relevance of a particular hypothesis must remain internal to the scientific world, and this is how we evolve in our understanding.

Financial temptations

Research is today fully embedded in market capitalism, which it made possible and continues to make possible. Financial gain has become a primary motive in a culture where more and more researchers are creating private companies themselves to financially exploit their results.

As a result of deliberate policies to shift research funding from public to private bodies, many members of the hierarchy of the new Church of Scientism are personal beneficiaries of the largesse of the various interest groups. Corruption is now endemic and conflicts of interest seriously undermine research activities. Only conflicts directly related to a particular piece of work need be disclosed, i.e. any direct funding that could influence its conclusions. This obligation is easily circumvented: favors can take many forms, from appointments as consultants to membership on company boards. When the generous donors to universities, research labs and scientific societies include some of the most powerful multinational conglomerates, can any work done within their walls be truly disinterested?

From knowledge to production

This corruption goes hand in hand with the transformation of the objectives of science since the turn of the 20th century to meet the demands of financial capitalism: the objective shifted from understanding to producing. The early 20th century saw the emergence of the researcher-technologist, first in chemistry, then in physics, and now in biology.

This subjugation of research to the economic ideal is maintained in particular by a culture of awards. This was initiated by a leading industrialist in the emerging military-industrial complex, Alfred Nobel, precisely at a time when the control of research became essential. This culture helps bring forth individuals and subjects that are dedicated to this ideal. It is not easy to recognize the strong subjectivity underlying the decision-making process here because, unlike science, the new creed of pseudoscience professes a value-free objectivity.

This does not mean that exceptional works are never properly recognized. But it is preferable that they contribute to the maintenance of economic objectives. Einstein only won the Nobel Prize for his work on the photoelectric effect.

But the awards have contributed to the rise of pseudoscience, since this is what can sustain production.

For example, one of the first Nobel Prizes (in chemistry) was awarded to Fritz Haber for the synthesis of ammonia. However, the method of producing artificial molecules does not reproduce the natural process and, therefore, their geometry differs from their natural counterparts. The correct scientific approach would therefore have been to study its impact on the environment and human health.

Marie Curie received the prize twice, so by the prize's own criteria it would be normal to believe that her works are more important than Einstein's. They are certainly more important from a profit perspective. Her objective was circular: to study the properties of radioactivity in order to constantly increase its production. What grew out of all the tragedies associated with this work was the development of radiotherapy as a treatment for cancer. Here again we have a circular pattern: the application of radiation to alleviate a disease that was then relatively rare compared to other diseases and whose prevalence radiation itself helped to increase.

Increasingly, research has become a matter of colossal machines requiring colossal funding in a few colossal locations. Thus, it is based on unique experiments that cannot be replicated at will, not least because of the necessary infrastructure. This over-reliance on technology makes us forget that processes created artificially in laboratories may very well not correspond to their real-life equivalents.

For example, in the 1950s it was discovered under laboratory conditions that organic matter could emerge from what could roughly be described as a methane soup. Because of this success, it has been forgotten that this does not imply that this is how it happened. And in fact, the first experimental study to reconstruct this early atmosphere based on real empirical evidence, carried out in 2011, indicates that, on the contrary, it may not have been as poor in oxygen as previously thought.

An inversion of the relationship between mathematics and science

The gradual takeover of science by pseudoscience is reflected in the gradual inversion of the relationship between science and mathematics.

With the growing importance of industrial production, mathematics acquired greater primacy within science because it is through the measurable that science can be converted into technology.

The first big step was the birth of computer science out of the demands of technology, when physics, mathematics and technology were amalgamated into a single field. This synthesis was certainly very constructive, but as it was carried out from a profit perspective, it also helped to sustain the maximization of profit. In this process, mathematics quietly took the driver's seat.

Mathematical applications depend on our purpose and are not limited by the need to conform to reality, unlike science. Thus, the cession of leadership to mathematics largely contributed to the emergence of pseudo-science. This inversion is also the consecration of the materialist perspective, since mathematics cannot take into account the factor of life.

The second step was the creation of bioengineering, one of the fastest growing sectors. Life came to be seen as a huge computer whose underlying programs can be transformed at will. Thus, the mechanistic view of nature was adapted to the new technological phase we have entered.

Mathematics via computer science is now taking its process to the final stage, where intelligence is reduced to a measurable quantity and knowledge to information flows, that is, to artificial intelligence, which in the end should lead us to transhumanism – total fusion of life with the machine in the driver's seat.

There is, however, a basic error with a machine-made virtuality that denies our given reality. Since “realities” are not “ghosts,” writer Charles Dickens warned that there was “a greater danger of them swallowing us up” sooner or later.

Science and Future

Therefore, the problem we face is the deformation of science into a pseudo-science responsible for man-made dangers. On the other hand, despite all its weaknesses, an approach based on observation and reason is certainly the most appropriate for the study of perceptible reality. To reject science is to renounce the wonderful possibility of unraveling some of nature's mysteries, even if only superficially, even if we always end up discovering that our previous conclusions were not entirely correct. To reject science is to reject our main survival tool.

Education

It is therefore essential to first distinguish science from its deformation. To do this, it is necessary to develop some appreciation not only of the technique of science, but also of its nature. Science is not a matter for experts. The amateur must claim his right not only to understand, but also to judge according to his own lights. Everyone is able to understand the ideas behind the technical part. The best way to learn how to do this is to read the works of pioneering scientific minds. Who better to explain the how and why of the ideas they helped develop?

However, the only real way to dispel the confusion between science and pseudo-science is to ensure that the education of future generations feeds our innate scientific intuition. Assimilating the spirit of science is learning to think for oneself, based not on dogma, but on an adequate assessment of the range of information available. This requires that instruction in technique be placed in the context of a discussion about the nature of science.

There are many ways to do this and not all are suitable for every student. This is why pluralism is essential in the type of education offered, both at school and at university.

Reduce the scope of harmful research

The ethical question

So far, the debate has focused on ethical issues. However, the problems have not been resolved. On the contrary, they are getting worse.

Ethics certainly influences science. Basing science on values that lead to a more peaceful future may seem like the best way forward. But is this really the case? What should these values be?

Ethical debates remain ineffective. On the other hand, restricting research within any ethical framework is harmful to science. Setting limits on the human mind erodes the creative dynamism essential to civilizations. Creativity takes unpredictable forms; therefore, it must be given free rein.

The debate should be moved to a less controversial level

Man-made dangers are the result of clearly unscientific research. The debate must therefore be about the scientific nature of the research. It is true that science cannot be expected to be defined precisely or for there to be sufficient consensus. However, it is possible to clearly identify what is not science. There is research that is contradicted by studies with a solid empirical basis, research based on observations that cannot be reproduced at will, or whose conclusions are based on reasoning that does not correlate with the data provided. In particular, this would greatly reduce or even eliminate controversial experiments.

For example, the field of medicine continues to be based on animal experimentation, despite the fact that this has been repeatedly denounced on ethical grounds for over a century. Today, as in the past, the answer is that human lives are saved as a result. However, due to biological dissimilarities between other species and ourselves, extrapolation to man is rarely justified and may even be harmful. In other words, these experiments are superfluous and science is certainly not about doing experiments just for the sake of doing them.

At the same time, transparency of all research must be legally required. This is far from the case today.

Research and money

That said, the raison d'être of pseudo-science is the pursuit of profit. The link between money and research must therefore be broken. This means, of course, that anyone with material interests must be prevented from exerting undue influence over the research. Anonymity would prevent the donor from choosing the recipient. For the purpose of science has been transformed into that of making profit. But it goes further: scientific activity itself has been transformed into a wealth-generating activity, thanks to the development of the notion of intellectual property and patents. As a result, the value of scientific work now depends on the amount of money it generates. It is not that science is above or below money; it is simply not related to it. Thus, the choice of a criterion inappropriate to science has contributed to its distortion. Therefore, it would be useful to discuss the abolition of patent laws and how to achieve this.

Excessive specialization

Overspecialization not only impedes the correct study of the most fundamental and pressing issues, as they often span multiple fields, but also the identification of these issues.

One way to remedy this problem is to transform universities into small academic communities without any subject barriers and to make academic studies less specialized. Naturally, the number of years of study will increase. But the current speed of training is a product of the mentality of recent centuries. It has lost its relevance now that lives are longer and technologies free us from various tasks by mechanizing them.

What science for the future?

As far as science itself is concerned, the question is what form it should take. The goal should be to reduce its weaknesses.

Maurits Escher's images show how little we are capable of understanding the complexity of reality. Even in the case of two intertwined patterns, the human brain can only observe one at a time. In other words, any light from a perspective only illuminates certain aspects. These aspects may even look different from different perspectives.

But every form of science is based on assumptions. Therefore, each of them may miss critical aspects. Science must therefore be restored in all its diversity.

Starting with a synthesis of the different forms that science has taken over the course of history could prove useful and lead to radically new ways of thinking. It would be foolish to reject wholesale the vast reservoir of knowledge already developed in different cultures at different times and grope in the dark.

The relevance of ancient approaches in the modern context is underlined by the example of the mathematician Srinivasa Ramanujan: the results he obtained following a tradition dating back to the Vedic era have proven to be essential in the most sophisticated modern physics.

Naturally, proven methods should not be abandoned, but supplemented by others, taking into account the many changes in our perception of reality brought about by science itself.

In short, as in education, it is only through the return of true pluralism that we can attempt to overcome some of the gaps in human understanding.

Scientific applications

Only after a broad and deep theoretical understanding can we begin to think about technological applications. As Ralph and Mildred Buchsbaum proposed half a century ago, “the burden of proof…of the absence of significant harm to man” should be legally “placed on the man who wants to bring about some change.” Today, proof of actual harm must be provided by victims. But it is unrealistic to rely on science to identify the exact cause of harm. In fact, science is generally unable to untangle the increasingly complex web of causes and pinpoint a culprit. Or, when it does, it is a long process. Meanwhile, damage is being done, sometimes irreversibly. Too often, there are still reasonable doubts. This puts people at the mercy of legal judgments based on technicalities and the opinions of those who make them.

Once the public has given its consent to a certain type of application, even more careful experiments must be carried out to ensure that the side effects do not have a negative impact on us. In other words, we need to give ourselves time before introducing new elements into nature; only carefully conducted experiments in natural environments and over sufficiently long periods of time can help us distinguish between applications whose main problem is excessive use and which can therefore be used within certain limits, and applications that present other problems.

Conclusion

Let's return to the initial question: what is science if it is in constant flux?

It is the human mind's disorderly but heroic attempt to understand the workings of the universe by persisting in the face of insurmountable obstacles, in the face of elusive understanding, despite countless failures and errors. These errors, in turn, give rise to new questions that must be answered. Scientific knowledge is the closest thing to certainty, but it is unable to offer certainty because certainty is incompatible with our human condition.

Despite the dominance of pseudo-science, real science has continued to make its way. During the last two centuries, the science we have developed has undermined the belief in a manifest reality of material substances interacting according to mechanically rigid rules. From a reality of isolated substances, each thing began to be seen as part of a whole. This whole cannot be reduced to the sum of its parts. On the other hand, no part can be explained independently of the whole. And yet, each individual part has its own meaning and reflects the whole in different ways. In short, our scientific understanding increasingly takes into account the complexity of our reality.

It is up to us to restore science to its rightful place in a supportive environment where scientists are finally free to focus on constructive topics of their choosing. Nowadays, many have to waste their talents and efforts to combat the lies that are crafted and propagated in the name of science.

In this context, industrial activities will also not necessarily be harmful, but beneficial to humanity.


Dr. Andrew Kaufman & Urmie Ray – Undermining the viral theory


Source: https://strategika.fr/2022/09/25/quand-la-science-devient-pseudo-science-urmi-ray/

Friday, 19 July 2024

Evidence for the use by Israel of a neutron uranium warhead in Palestine and Lebanon


 

Christopher Busby
Green Audit, Dec 8th 2023

Abstract

Since 2003, measurements made by Green Audit in Fallujah (Iraq, 2003), Lebanon (2006) and Gaza (2008) have provided unequivocal evidence of Uranium residues which show anomalous U-238/U-235 isotope signature ratios. Results of measurements by independent laboratories in Europe and the UK, using different techniques, revealed the presence of enriched Uranium in biological materials and environmental samples including soil, bomb craters and air (as recorded in vehicle air filter dust). Recent 2021 results published in the journal Nature show that enrichment levels in background samples from Gaza have been increasing markedly since 2008. Since enriched Uranium is an anthropogenic substance which does not exist in nature, the question arises as to its source in the weapons employed by the USA (Fallujah) and Israel (Lebanon, Gaza). It is proposed that the only logical answer is that a Uranium-based weapon exists that produces U-235 by neutron activation and has been deployed. Such a weapon must be some kind of neutron bomb.

1. Background

The issue of the health effects of Depleted Uranium munitions has continued to be an area of significant scientific difference of opinion since the weapons began to be employed by the USA in Iraq in 1991, and later in the Balkans. The authorities in the West, employing the risk model of the International Commission on Radiological Protection (ICRP), moved to deny the health effects which quickly emerged in Iraqi populations in the 1990s, including cancer increases and birth defects, by arguing that owing to its very low radioactivity DU could not be considered as a cause [1, 2, 3, 4]. However, similar remarkable increases in cancer were reported from the Balkans (Serbia), with reports of cancer increases in Italian and Portuguese UN KFOR peacekeeping soldiers stationed in areas of Kosovo where the USA had conceded that DU had been deployed. A survey of Kosovo by Green Audit in 2001 revealed the existence of DU particles in Djakove, Kosovo, and samples were analysed in the UK [5]. The isotope ratio, Uranium-238/Uranium-235, which in natural soils is 137.88, showed depleted ratios as high as 300. Following complaints by US and UK Gulf War veterans of a range of conditions (termed Gulf War Syndrome) which they blamed on their exposure to DU dust, created when the penetrator weapons struck their target and burned, significant scientific interest turned to the issue. This contribution will not rehearse the arguments about DU and health. It is concerned with a different investigation.

2. Lebanon 2006

In 2006, Israel bombed the Lebanon. Green Audit was contacted by Prof Ali Al Khobeisi, a physicist and member of the Lebanese Academy of Sciences. He was aware that Busby was a member of the UK Ministry of Defence Depleted Uranium Oversight Board (DUOB) and part-author of the DUOB Minority Report [6]. He was concerned about gamma radiation measurements he had made of a weapon crater in Khiam, Lebanon, which revealed an approximate 8-fold excess in gamma radiation dose rate, relative to background, at the crater.

Busby asked a colleague (Dai Williams) to fly to the Lebanon, and obtain samples from the crater soil and possibly from an ambulance operating in Beirut, where some very large bombs had been dropped to destroy a command bunker. Several samples were brought back and analysed, using both alpha spectrometry in one laboratory and Inductively Coupled Plasma Mass Spectrometry (ICPMS) in a separate one. Later Prof Khobeisi came to the UK with further samples to discuss the issue at the Green Audit laboratory in Aberystwyth. The presence of Enriched Uranium in the Lebanon in 2006 became a Media issue when it was written up by Robert Fisk in the Independent “Israel’s secret Uranium bomb” [7]. The UN sent a team to the Lebanon to take samples and Green Audit asked Mr Williams to also take samples so that split samples could be analysed. The Green Audit samples continued to show enriched Uranium, but the UN samples were said to show natural ratios. The issue has never been resolved. The Green Audit results are summarised in Table 1. 

3. Gaza 2008 

The issue of the enriched Uranium in the Lebanon had, by the time of the 2008 bombing of Gaza, been widely covered by media. In 2009, Green Audit was contacted by doctors in Gaza who were concerned about very unusual weapon effects seen in children and adults exposed to the flash and shock from Israeli bombs and missiles. Busby arranged to visit Egypt to obtain samples, again, vehicle filter samples. However, despite a cover letter from the President of International Doctors for the Environment in Belgium, the UK Foreign Office would not provide permission. Samples were nevertheless smuggled out of Gaza to the UK via the Irish Republic, and measurements made of the Uranium enrichment ratio. As in the Lebanon, results showed presence of enriched Uranium (see Table 1). 

4. Fallujah Iraq, 2003 

In 2010, a series of epidemiological and environmental studies were carried out to investigate reports of high levels of cancer and birth defects being reported by doctors in Fallujah, where there had been very concentrated bombardment of the town by the US forces in 2003 [8,9,10]. Following a questionnaire epidemiology study [10] which found profoundly alarming levels of genetic damage (cancer, birth defects, sex ratio perturbation) samples of hair from the parents of the birth defect children were obtained and analysed for 52 elements using ICPMS. Results showed significantly raised levels of Uranium (relative to published and control values) but more important, indicated enriched Uranium signatures. The authors pointed out this anomalous finding and speculated that some new weapons had been deployed in the Fallujah bombardment [9]. 

5. Gaza 2021 

An important study of samples of soil, sand and recycled building material from Gaza and sand from Sinai was published in 2021 [11]. Results indicated enriched Uranium in all the Gaza samples but not in those from Sinai. The method employed was gamma spectrometry, which is arguably more accurate than alpha spectrometry or ICPMS since it is a whole-specimen method and does not rely on pre-measurement chemistry, which is known to lose up to 40% of the Uranium in the sample. The degree of enrichment found by the authors was very much greater than that found in Gaza after the 2008 bombing. Gaza had also been bombed by Israel in 2014.

Table 1. Summary of Uranium Enrichment (atom Ratio) in all samples from Iraq and Palestine 2003-2021.

Note: The isotope (atom) ratio U-238/U-235 was calculated from the reported activity ratio, assuming that the natural activity ratio is 21.5.
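For readers who wish to check this conversion, the following Python sketch applies the standard relation between activity and atom ratios via the half-lives of the two isotopes; the half-lives are standard reference values, and no sample data from Table 1 is reproduced here.

```python
# Standard reference half-lives (years); activity A = λN, so
# atom ratio N238/N235 = (A238/A235) x (T_238 / T_235).
T_U238 = 4.468e9   # half-life of U-238
T_U235 = 7.04e8    # half-life of U-235

def atom_ratio(activity_ratio):
    """Convert a measured U-238/U-235 activity ratio to an atom ratio."""
    return activity_ratio * (T_U238 / T_U235)

print(f"natural activity ratio 21.7 -> atom ratio ≈ {atom_ratio(21.7):.1f}")  # ≈ 137.7
print(f"example activity ratio 19   -> atom ratio ≈ {atom_ratio(19.0):.1f}")
# Atom ratios significantly below ~137.9 indicate excess U-235 (enrichment);
# ratios above it indicate depletion.
```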

6. Natural Uranium in the environment.

Uranium in the environment, as mined, has three isotopes: U-238, U-235 and U-234. Once the importance of the fissile isotope U-235 for A-bomb development was realised, massive projects were set up from 1943 onwards to separate the U-235 from natural Uranium; the product was employed in the atomic bomb used against Japan at Hiroshima in 1945. In passing, it is of interest that in separating the U-235 using centrifuges, or other methods relying on mass differences, the resulting enriched Uranium also carried large quantities of the even lighter U-234, which is a decay product of U-238 (via two short-lived isotopes, Thorium-234 and Protactinium-234m) present in natural Uranium in activity equilibrium with the U-238. Thus, natural Uranium contains U-238 and U-234 in equal activities. After separation of the U-235, the remaining Uranium is termed Depleted Uranium, or DU. It is, of course, radioactive, and so must by law be disposed of as a radioactive substance. Its activity is considered low, 12.4 million decays per second (Becquerel) per kilogram, and since these decays are alpha particle decays which cannot penetrate skin, it only represents a health hazard if internalised by ingestion or inhalation.
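The figure of 12.4 million Becquerel per kilogram can be checked with a short Python calculation, treating DU as pure U-238 (an approximation on my part, since real DU also contains residual U-235 and U-234):

```python
import math

# Standard physical constants and reference values, not figures from this report.
N_A    = 6.022e23              # Avogadro's number (atoms per mole)
M_U238 = 238.05                # molar mass of U-238 (grams per mole)
T_half = 4.468e9 * 3.156e7     # half-life of U-238 converted to seconds

lam = math.log(2) / T_half               # decay constant (per second)
atoms_per_kg = N_A / M_U238 * 1000.0     # atoms of U-238 in one kilogram
activity = lam * atoms_per_kg            # decays per second (Bq) per kilogram

print(f"specific activity of U-238 ≈ {activity / 1e6:.1f} MBq per kg")  # ≈ 12.4
```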

It must be stressed: if enriched Uranium is found in environmental samples, its origin has to be an enrichment plant or some other anthropogenic process. It is not natural. Since enriched Uranium has been turning up in the Middle East, and increasingly so in Gaza, the question arises: where is it from?

7. Enriched Uranium in Gaza, the Lebanon and Iraq.

Reports that say that something hasn't happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don't know we don't know. And if one looks throughout the history of our country and other free countries, it is the latter category that tends to be the difficult ones.

Donald Rumsfeld, Pentagon News Briefing, Feb 2002

There are many questions relating to the findings of Enriched Uranium in Lebanon, Gaza and Fallujah. But logic points to only one overall conclusion, in Rumsfeld terms, a thing we know we know. This is that U-235 is in clear and statistically significant excess: it is present in the samples. In the case of the recent 2021 Gaza study, it is definitely there in 55 of 69 samples.

The only samples where it is not clearly present are the 14 samples from Sinai, that is, not from Gaza, and which therefore could be employed as a control group.

We also know that U-235 in excess can only come from anthropogenic sources:

• It can be separated from the Uranium ore by refining and employing centrifuges or other technical means to remove it on the basis of its slightly lower atomic mass.

• It can be produced by neutron activation, that is, the irradiation of U-234 with neutrons. Such production occurs in a nuclear explosion, or in a nuclear reactor.

• According to the late Prof Del Guidice (see below [16]), it can also be produced by the irradiation of U-238 with neutrons, leading to the formation of U-239, which may lose an alpha particle and produce U-235. The normal decay product of U-239 is Plutonium-239.

Following these logical steps, if U-235 is found in the three locations in the Middle East shown in Table 1, there are only two possibilities:

1. The Israelis dropped U-235 which they had produced in Israel or purchased in bombs or other Uranium weapons.

2. The Israelis employed a weapon which contained U-238 but which produced U-235. Such a weapon must produce neutrons, and would be designated a neutron bomb.

The first of these possibilities can be discarded: to drop enriched Uranium on your enemy is absurd. It is expensive. It is like killing your enemy by dropping diamonds. Enriched Uranium was reportedly valued at £250,000 a kilogram in the 1990s [17]. This leaves possibility (2), which is that the source of the U-235 is a neutron-producing bomb.

8. Neutron Bomb

The known knowns

This contribution will not provide a review of what is known about neutron bombs. The Rumsfeld known known here is that they were apparently invented by Sam Cohen, who worked for the Rand Corporation and argued that the employment of an Enhanced Radiation Weapon which irradiated local populations with neutrons was an efficient war method, since it killed enemy personnel who were sheltering behind concrete walls or in bunkers without destroying the buildings or infrastructure providing shelter. Cohen argued for the use of neutron bombs in Vietnam, but was sacked by the Rand Corporation. Later, in the Reagan period, Cohen returned to work for the administration and the USA began to manufacture neutron warheads for anti-ballistic missile systems. By the 1990s it was generally conceded that all the major nuclear States had neutron bombs in their stockpiles.

This included Israel which, according to whistle-blowers like Mordecai Vanunu, had tested a neutron bomb in South Africa [18]. However, the design of the Cohen-type warhead was fairly conventional. It was merely a conventional U-235 warhead of low yield without a Uranium-238 (DU) tamper case to reflect the initial neutron burst back into the system and thus increase the yield. It contained Tritium and Deuterium and relied upon a fusion reaction to create Helium-4 and release neutrons. In this case, the yield (kT TNT) is not the object. The creation of lethal neutron exposures is what is aimed for. In passing, neutrons have between 10-fold and 100-fold relative biological effectiveness, and so would also be a perfect weapon for those wishing to destroy the genetic integrity, fertility, and longevity (cancer etc.) of the enemy civilian population.

The known unknowns: the Cold Fusion warhead—Red Mercury

We know that there are things we know that we don't know. But there are pieces of evidence that strongly suggest that a new weapon – one involving Uranium, creating enriched Uranium and employing cold fusion – was invented by the Soviet Union at some time in the 1980s and produced in the 1990s. In the 1990s there were widely discussed reports and statements about a new radioactive weapon based on a material called Red Mercury. The UK Channel 4 produced a documentary about this weapon in which they consulted Dr Frank Barnaby to see if there could be some explanation. Did Red Mercury exist? What was it? Could it form the basis for a bomb (which one Russian expert told them was the size of a ball-point pen cap but could destroy Moscow)? [17] Apparently Red Mercury was a chemical compound, Mercury Antimony Oxide (Hg2Sb2O7), that had been placed in a reactor for some weeks, was radioactive, and potentially could explode with a level of energy that could destroy Moscow and so forth. Later, after this documentary, the idea that such a weapon was likely or possible was dismissed by the scientific community.

And rightly so. Nevertheless, Sam Cohen stated that he believed it possible. But there were some interesting pieces of information about Red Mercury that emerged for those who knew what was important.

Interestingly, Cohen referred to a “ballotechnic” mechanism for Red Mercury. This is an explosive that releases energy on impact purely as a result of impact pressure.

These included:

• The material was being sold at £250,000 a kilogram, and the Soviets were selling it, there were orders and other documents seen by Channel 4.
• The material was very dense, the density was 20g/cc.
• The Soviet code word for Enriched Uranium in the 1940s was “Red Mercury”.
• Cohen, who would know, referred to an impact initiation weapon, a “ballotechnic”.

It is not difficult to conclude from this that Red Mercury was, in fact, some kind of Uranium which had been processed in some way. Mercury has a density of 13.5 g/cc, Antimony 6.7 g/cc, and it is hard to see how a compound of the two could have a density of 20 after irradiation with neutrons for three weeks. It is chemically impossible. Uranium does have a density of around 20.
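The density argument can be made explicit with a small Python sketch, under the simplifying volume-additive assumption (mine, for illustration only) that a mixture cannot be denser than its densest component; the elemental densities used are standard values.

```python
# Elemental densities in g/cc (standard values); volume-additive mixing is an
# illustrative simplification, not a claim about the real compound.
rho_Hg, rho_Sb = 13.5, 6.7

def mixture_density(mass_fraction_Hg):
    """Density of an ideal, volume-additive Hg/Sb mixture."""
    w = mass_fraction_Hg
    return 1.0 / (w / rho_Hg + (1.0 - w) / rho_Sb)

for w in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"Hg mass fraction {w:.2f}: density ≈ {mixture_density(w):.1f} g/cc")
# Even the pure-mercury limit gives only 13.5 g/cc, and adding oxygen can only
# lower the figure, whereas uranium metal is around 19-20 g/cc.
```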

In which case, why was this Red Mercury idea started? It is easy to speculate that it was a cover for a real weapon, a novel and very small nuclear weapon based on what was already known, indeed what Sam Cohen is unlikely not to have known.

The Cold Fusion Neutron Bomb

Fusion of Tritium and Deuterium to give Helium-4, a neutron and huge amounts of energy has been and remains the Holy Grail of Physics. The energy of fusion produces enormous temperatures, no nuclear waste in the form of fission products like Strontium-90 and Caesium-137 and the reaction is the one which powers the Sun. But the temperatures involved are so great that the problem is how to constrain the reaction. Normal materials will vapourise and so the reaction must either be very short and/or constrained in a magnetic field. In the 1980s Fleischmann at Southampton (UK) and Pons in USA claimed to have brought about fusion by electrolysing Deuterium Oxide with Palladium electrodes [19]. 

The experiment was repeated by the Harwell laboratory in Oxford (the UK government Atomic Energy Authority laboratory) and reported to not occur. Since then, the question of cold fusion has continued to exercise the scientific community [19].

Shortly after the Green Audit report on Enriched Uranium in Lebanon, the author (Busby) was contacted by an Italian physicist, Emilio Del Guidice [16], who travelled to London to discuss his ideas about the finding.

The bomb, he suggested, is a version of the cold fusion discovered by Fleischmann. This author (Busby) worked with Fleischmann in 1979 on the Raman spectrum of adsorbed water. Del Guidice said that Uranium dissolves hydrogen (or Deuterium or Tritium), which then becomes trapped in the matrix. This is plausible, as the Uranium atom (mass 238) is very large compared with hydrogen (mass 1), so there is a lot of space in the crystal, and also many electrons around the Uranium atom (92, against Hydrogen's 1). Del Guidice believed that if the Uranium laced with hydrogen hit a target and deformed, whilst also burning at a very high temperature, there would be fusion. In this case (he said) the 14 MeV neutron produced would knock the U-238 up to a metastable U-239, and this would decay to U-235 with emission of an alpha particle. The reaction he referred to is

T + D → n (14.1 MeV) + He-4 (3.5 MeV)

At the time, I believed this. But later I thought some more about it. First, U-239 decays (via Neptunium-239) to Plutonium-239 and not to Uranium-235. Plutonium-239 does decay to U-235 by alpha decay, but with a long half-life. But what is certainly present in the Uranium is U-234. This would take up a neutron to give U-235. This reaction is a much more likely source of the U-235. The second problem with the Del Guidice bomb is that hydrogen does not dissolve in Uranium. It may be that (as a physicist, and an Italian) Del Guidice did not use the correct English.
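A short Python calculation makes the point about the Pu-239 route quantitative (the half-life used is the standard reference value, not a figure from this report): with an alpha half-life of roughly 24,100 years, only a tiny fraction of any Pu-239 formed could have decayed to U-235 in the years between deployment and sampling, whereas neutron capture on U-234 yields U-235 directly.

```python
import math

T_PU239 = 24_100.0   # alpha half-life of Pu-239 in years (standard value)

def fraction_decayed(years):
    """Fraction of an initial Pu-239 inventory that has decayed to U-235."""
    return 1.0 - math.exp(-math.log(2) * years / T_PU239)

for years in (2, 5, 15):
    pct = fraction_decayed(years) * 100
    print(f"after {years:2d} years: {pct:.3f}% of the Pu-239 has become U-235")
```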

If you heat Uranium metal to 300 degrees it reacts with hydrogen to give Uranium hydride, UH3, and presumably then also with Deuterium and Tritium. These are molecular species, not, as Del Guidice told me, a solution or an interstitial affair. When the system gets above about 700 degrees the hydrides decompose back to Uranium and hydrogen. This is the basis for a nuclear power system which cannot melt down, as the neutron moderator, hydrogen, reversibly leaves the Uranium and stops the reactor. Neutrons are stopped by low atomic number elements: Lithium, Beryllium, Boron, Hydrogen. They pass through high atomic number elements (e.g. in concrete). They are stopped ballistically, not ionically, as they carry no charge. Their relative biological effectiveness (ionisation) results from the kinetic energy they impart to hydrogen in water. As already stated, it is about 100 (alpha is 20).

So a plausible method is as follows: a mix of Depleted Uranium is made with varying quantities of UT3 and UD3. When this is heated, by explosive or simply by impact, fusion occurs, as Del Giudice believed, producing a massive release of 14 MeV neutrons together with 3.5 MeV alpha particles. There is no tamper, as there is in a thermonuclear weapon, so the neutrons are not reflected back into the device but are allowed to escape. The device is very small and of low yield. It is reported that countries like Israel and the USA had neutron land mines and shells. The key is the very low yield explosion (tons of TNT); a rough idea of the numbers involved is sketched below.
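As a purely illustrative back-of-the-envelope check, and not a description of any actual device, the short sketch below converts an assumed yield in tons of TNT into the corresponding number of D-T fusions (and therefore 14 MeV neutrons), using only standard constants; the yields chosen are arbitrary examples.

# Back-of-the-envelope estimate: 14 MeV neutrons per ton of TNT equivalent,
# assuming all the energy comes from D-T fusion. Yields are arbitrary examples.

MEV_TO_J = 1.602e-13      # joules per MeV
Q_DT_MEV = 17.6           # energy released per D-T fusion (MeV)
TON_TNT_J = 4.184e9       # joules per ton of TNT equivalent

def neutrons_for_yield(yield_tons_tnt: float) -> float:
    """Number of D-T fusions (one 14 MeV neutron each) for a given yield."""
    return (yield_tons_tnt * TON_TNT_J) / (Q_DT_MEV * MEV_TO_J)

if __name__ == "__main__":
    for y in (0.1, 1.0, 10.0):
        print(f"{y:5.1f} t TNT  ->  {neutrons_for_yield(y):.2e} neutrons")

Even a tenth of a ton of TNT equivalent corresponds to of the order of 10^20 fusion neutrons, consistent with the picture of a physically small, low-yield but intensely irradiating device.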

If neutron activation of U-238 occurs, or partly occurs, then Plutonium-239 will be present. In the Depleted Uranium Oversight Board report [6], it was stated that Pu-239, and also U-236, had been measured in DU residues, but this was explained away as contamination in the source material. However, no Plutonium was found in the Lebanon samples [13].

This weapon is arguably the fabled Red Mercury. It would be small, and it needs no initiator since it is an impact weapon, though versions with initiators might also exist. It would be produced from Uranium reacted with Tritium and Deuterium in some ratio, possibly with an alloying substance like Niobium (found in excess by Green Audit in the Gaza samples). It could be tunable: the proportion of UT3 to UD3 in the mix is decided in the manufacturing process.

Unknown Unknowns

For obvious reasons, little can be listed here. However, the neutron bomb outlined by Del Giudice may be only one version of the system; there may be other initiator processes. It is pointless to speculate further here. It is hoped that someone in the military will provide, or be forced to provide, further details.

9. How might this issue be investigated?

Of course, there will be activation products in local materials: soil, concrete, etc. I obtained some concrete from the Baghdad airport after the US killed the Republican Guard who were defending it. However, there was no money to measure anything, and by the time the material came to England any excess induced radioactivity would have decayed. I was told by Iraqis that there was a big flash and that the defenders were all found dead in their bunkers the next day. The US would not let the IAEA in to measure anything for six months, fenced the site off and removed the debris into the desert. Note that Co-60 is an activation product which would have been present in any steel: metal guns, metal shielding, reinforcing rods, etc. There would be residual gamma radiation at the impact site, and there could also be residual Tritiated water and Carbon-14 contamination.

The late Prof Ali Khobeisi measured residual gamma radiation in the crater at Khiam, Lebanon, in 2006: about 20 times background, disappearing over about six weeks. That is a reasonable decay period for the immediate neutron activation products in soil (though not for Co-60 in steel, which is far longer-lived). Table 2 gives a list of methods that can be employed to identify the use of a neutron bomb.

Table 2. What methods can be employed to investigate the use of a neutron weapon?
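As a rough consistency check on the Khiam observation above, the ordinary decay law $A(t) = A_0 e^{-\lambda t}$, with $\lambda = \ln 2 / T_{1/2}$, can be applied to the reported signal of about 20 times background, assuming for simplicity a single dominant activation product. A 20-times-background excess falls back to background after roughly $\log_2 20 \approx 4.3$ half-lives, so if that took about six weeks (42 days), the dominant products had

\[
T_{1/2} \approx \frac{42\ \text{days}}{4.3} \approx 10\ \text{days},
\]

i.e. the disappearance over six weeks is consistent with soil activation products whose half-lives are of the order of days, and inconsistent with a long-lived nuclide such as Co-60 (half-life about 5.3 years), which would still be measurable today.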

10. Health effects

This contribution would not be complete without touching on the health effects seen in populations where these weapons were deployed. If the weapons caused exposures to (a) neutrons and (b) Uranium aerosol particles, then genetic effects would be expected, as well as immediate effects such as severe burns or even vapourised limbs. No reports of such effects have been published for Lebanon, as far as this author can find. For Fallujah the genetic effects found were profound, and included congenital malformations, high rates of cancer and leukaemia, and a skewed birth sex ratio [8,10].

For Gaza, there have been several reports of excess birth defects together with measurements of elements in hair, including Uranium [20,21]. The authors did not single out Uranium as a cause, but rather seemed to believe that the effects were due to some generic “heavy metal” effect. It is reasonable to conclude, from the Fallujah results and other studies of the Iraqi and Balkan populations, that these weapons are effectively genetic destruction weapons.

Conclusion and further investigation.

An inevitable deduction from the consistent findings of enriched Uranium in samples from Gaza, Lebanon and Iraq is that a neutron weapon of some kind has been employed since the second Gulf War, and possibly before then. This is an Israeli secret weapon, as reported by Robert Fisk in the Independent in 2006 [7]. The increases in congenital effects seen in the Fallujah population [8,9,10] and also in Gaza [20,21] can plausibly have resulted from exposure to neutrons as well as to the Uranium particulate aerosols. The weapon is ideal both for armies engaged in the methodical destruction of fighters hidden in urban environments (since neutrons pass through walls) and for any State that aims to destroy a civilian population with a genetic mutation weapon (cancer, fertility loss, birth defects). It is, however, a nuclear weapon, and those deploying it are using a nuclear weapon against civilian populations as part of a cynical project to destroy an enemy population without acknowledging it; this is a war crime.

The problem is that the laboratories where such samples are measured, using the very expensive equipment necessary to obtain relevant results, are mostly funded directly or indirectly by governments and the nuclear-military complex. Furthermore, as this author found in the case of the UN investigation of the Lebanon craters, the laboratories used by the UN do not tell the truth: in this case the Spiez laboratory in Switzerland, which measured the split samples obtained by Green Audit in 2006.

Furthermore, as this author also knows, scientific journals often either refuse to publish contributions that address such politically sensitive topics, or their reviewers dismiss the results.

In the case of a recent paper reporting increases in Uranium in February–March 2022, attributed to the Ukraine war and found in High Volume Air Samplers deployed at the Atomic Weapons Establishment, Aldermaston, UK, the paper was sent to two journals: the first flatly refused to accept it, and the second sent it to a reviewer and then dismissed it. Yet the raw data showing the significant increase in Uranium particles in the air were supplied to the journals; a child would have seen the increases.

But the public have access to simple methods: at minimum they can record radiation increases near any impact site and report them in videos uploaded to the internet. This development, the use of the neutron weapon, is a very large ethical and public health issue.

References

1. Royal Society (2001) The Health Effects of Depleted Uranium Munitions. Part 1. London: Royal Society

2. Royal Society (2002) The Health Effects of Depleted Uranium Munitions. Part 2. London: Royal Society

3. World Health Organization (2001) Depleted Uranium: sources, exposure and health effects. Geneva: WHO https://www.who.int/publications/i/item/WHO-SDE-PHE-01.1

4. International Atomic Energy Agency. Depleted Uranium
https://www.iaea.org/topics/spent-fuel-management/depleted-uranium

5. Green Audit (2001) Depleted Uranium in Kosovo Samples. Commissioned Report for Nippon TV Japan (Unpublished)

6. Depleted Uranium Oversight Board (DUOB) (2007) Final Report of the UK Ministry of Defence Depleted Uranium Oversight Board.
https://webarchive.nationalarchives.gov.uk/ukgwa/+/http:/www.mod.uk/DefenceInternet/AboutDefence/CorporatePublicationsHealthandSafetyPublications/Uranium/FinalReportOfTheDepletedUraniumOversightBoard.htm

7. Robert Fisk (2006) The mystery of Israel’s secret Uranium bomb. The Independent.
https://www.independent.co.uk/voices/commentators/fisk/robert-fisk-mystery-of-israel-s-secret-uranium-bomb-6230359.html

8. Alaani S, Al-Fallouji M, Busby C*, Hamdan M (2012) Pilot study of congenital anomaly rates at birth in Fallujah, Iraq, 2010. Journal of the Islamic Medical Association of North America 44, August 2012. Available at: http://jima.imana.org/article/view/10463

9. Alaani S, Tafash M, Busby C*, Hamdan M, Blaurock-Busch E (2011) Uranium and other contaminants in hair from the parents of children with congenital anomalies in Fallujah, Iraq. Conflict and Health 5: 15.

10. Busby C*, Hamdan M, Ariabi E (2010) Cancer, Infant Mortality and Birth Sex-Ratio in Fallujah, Iraq 2005–2009. Int J Environ Res Public Health 7(7): 2828–2837.

11. Abd Elkader MA, Shinonaga T, Sherif MM (2021) Radiological hazard assessments of radionuclides in building materials, soils and sands from the Gaza Strip and the north of the Sinai Peninsula. Scientific Reports 11: 23251.

12. Busby C, Williams D (2006) Evidence of Enriched Uranium in guided weapons employed by the Israeli Military in Lebanon in July 2006: Preliminary Note. Green Audit Research Note 6/2006, October 20 2006. Aberystwyth: Green Audit.
https://www.researchgate.net/publication/265064420_Evidence_of_Enriched_Uranium_in_guided_weapons_employed_by_the_Israeli_Military_in_Lebanon_in_July_2006_Preliminary_Note

13. Busby C, Williams D (2006) Further evidence of enriched Uranium in guided weapons employed by the Israeli Military in Lebanon in July 2006: ambulance air filter analysis. Green Audit Research Note 7/2006, November 3 2006. Aberystwyth: Green Audit.
https://www.researchgate.net/publication/228485893_Further_Evidence_of_Enriched_Uranium_in_guided_weapons_employed_by_the_Israeli_Military_in_Lebanon_in_July_2006_Ambulance_Air_Filter_Analysis

14. Vignard K (2008) Disarmament Forum 3: Uranium Weapons. Geneva: UNIDIR.

15. Williams D (2006) Eos weapons study in Lebanon, September 2006 – Interim Report. Eos, Surrey, UK. www.eoslifework.co.uk

16. Emilio Del Giudice, theoretical physicist, 1940–2014. See
https://en.wikipedia.org/wiki/Emilio_Del_Giudice

17. Dr Frank Barnaby, interviewed in the 1993 Channel 4 Dispatches documentary “Does Red Mercury Exist?”. YouTube: https://youtu.be/ESCTZETN4-8?si=ZIIXVIegTNBUhjrZ

18. The Wikipedia entry for Neutron bomb has considerable information relevant to the discussion. See: https://en.wikipedia.org/wiki/Neutron_bomb

19. See: https://en.wikipedia.org/wiki/Cold_fusion and the references cited there.

20. Naim A, Al Dalies H, El Balawi M et al (2012) Birth defects in Gaza: prevalence, types, familiarity and correlation with environmental factors. Int J Environ Res Public Health 9(5): 1732–1747.

21. Manduca P, Diab SY, Qouta SR (2017) A cross-sectional study of the relationship between exposure of pregnant women to military attacks in 2014 in Gaza and the load of heavy metals in the hair of mothers and newborns. BMJ Open 7(7): e014035.


New Neutron Bomb used in Gaza, Lebanon and Iraq:

Source: https://www.researchgate.net/publication/377297280_Red_Mercury_Evidence_for_the_use_by_Israel_of_a_novel_uranium_warhead_in_Palestine_and_Lebanon

MASSIVE Global Technology Outage Grounds Flights, Takes Down Banks, Medical Services and 911


 

Daisy Luther
July 19, 2024

A massive technology outage is wreaking havoc worldwide. Early reports attribute the chaos to an issue involving Microsoft Windows, the dominant operating system for desktop computers, and the Falcon security software from CrowdStrike. We’ve been assured this isn’t a cyberattack, but a software glitch affecting the OS of computers across the world.

Here are some of the ways this outage is causing global chaos today.

Flights grounded

American Airlines, United, Frontier, and Delta made a drastic move, grounding all flights.

The ground stop impacted all flights from the airlines, regardless of their destination, the Federal Aviation Administration said. As of mid-Friday morning, more than 600 flights into, out of or within the United States had been canceled, according to FlightAware.com.

The outage does NOT impact flights that were in the air when it occurred. The problem is with the other processes, such as check-in, luggage, and booking.

American Airlines has since resolved the problem, and its flights are resuming. But for tens of thousands of other travelers, chaos and uncertainty still reign in the airports.

As well, in New York City, the Metropolitan Transportation Authority has had its systems go offline.

Banking disrupted

Around the world, operations have ground to a halt at many major banks, including:

  • Commonwealth Bank in Australia
  • Capitec in South Africa
  • Certain operations at Barclays in the UK
  • Banks in Israel, which have not been named at the time of this post

Reported issues include the inability to use ATMs, access online banking, or operate self-checkout registers. The London Stock Exchange website has also been unable to update.

911 outages

According to Newsweek, 911 services in many parts of the country have been affected.

Emergency services and access to 911 operators have been affected in multiple states as a major IT outage causes chaos worldwide.

Alaska State Troopers posted on Facebook on Thursday night: “Due to a nationwide technology-related outage, many 911 and non-emergency call centers are not working correctly across the State of Alaska.”

They added that, in case of emergency, alternative numbers can be contacted.

New Hampshire also experienced a service outage overnight, but it has since been restored, according to the Department of Safety, local ABC affiliate WMUR reported.

In Phoenix, Arizona, emergency services have faced further issues. As some computers stopped working, dispatchers had no access to the internet or their usual systems, and were manually writing down information for first responders, according to local news outlet The Arizona Republic.

Downdetector has also noted a spike in reports about 911 services.

Medical services interrupted

Health systems and hospitals around the world have reported issues with their systems as well. Israeli health services have reported problems, and the NHS has said that most general practitioner offices in the UK have been affected.

News broadcasts down

In some areas, major mainstream media outlets are at a complete standstill.

Australian and British broadcasters, including SBS, Network 10, the ABC, Sky News Australia and Sky News UK, were all taken off air.

What caused all of this?

CNN reports that the issue originated with a software update issued by CrowdStrike. It is specifically affecting Falcon, one of CrowdStrike’s main cybersecurity software products.

CrowdStrike’s cybersecurity software — used by numerous Fortune 500 companies — detects and blocks hacking threats. Like other cybersecurity products, the software requires deep-level access to a computer’s operating system to scan for those threats. In this case, computers running Microsoft Windows appear to be crashing because of the faulty way a software code update issued by CrowdStrike is interacting with the Windows system.
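As a loose, purely illustrative analogy, and emphatically not CrowdStrike’s actual code or file format, the sketch below shows in user-space terms why a component that must successfully load all of its content files before anything else is allowed to run will take the whole system down with it when a single malformed file arrives; all names here are hypothetical.

# Hypothetical illustration only (invented names; not CrowdStrike's code).
# A "sensor" that must parse every content file before services may start;
# one malformed file halts everything, by design.
import json
import sys
from pathlib import Path

def load_content_updates(content_dir: Path) -> list:
    """Parse every content file; any parse error is fatal, mirroring a
    boot-critical component that cannot simply skip bad input."""
    updates = []
    for path in sorted(content_dir.glob("*.json")):
        with path.open() as f:
            updates.append(json.load(f))  # raises on a malformed file
    return updates

def start_system(content_dir: Path) -> None:
    try:
        rules = load_content_updates(content_dir)
    except (json.JSONDecodeError, OSError) as err:
        # In a kernel-level driver the analogous failure is not a tidy
        # exception but a crash of the whole operating system.
        sys.exit(f"content update failed to load ({err}); system halted")
    print(f"loaded {len(rules)} content updates; services starting")

if __name__ == "__main__":
    start_system(Path("./content_updates"))  # hypothetical directory

The point of the analogy is simply that the more privileged and the earlier-loading a component is, the less room there is for a bad update to fail gracefully.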

George Kurtz, the CEO of CrowdStrike, says that they have identified the problem and deployed a fix. Here’s the statement he released to the media:

CrowdStrike is actively working with customers impacted by a defect found in a single content update for Windows hosts. Mac and Linux hosts are not impacted. This is not a security incident or cyberattack.

The issue has been identified, isolated and a fix has been deployed.

We refer customers to the support portal for the latest updates and will continue to provide complete and continuous updates on our website. We further recommend organizations ensure they’re communicating with CrowdStrike representatives through official channels. Our team is fully mobilized to ensure the security and stability of CrowdStrike customers.

One little glitch caused mass chaos.

Regardless of the cause, this certainly goes a long way toward reminding us how much our systems are reliant on computers. While they do make many things far easier, many businesses are entirely unable to function without them.

It’s just a glimpse into the disaster we could face with a significant cyber attack or other service outage.

Hopefully, this will be short-lived, but it should also serve as a cautionary tale for the businesses affected.

Are any of your services affected?

Have you experienced any issues related to this glitch? If so, can you tell us about it? Do you think there is a problem aside from the software?

Let’s discuss it in the comments section.

Source: The Organic Prepper