There
is a lot of discussion about science nowadays, but is what the media dubs
“scientific” immune to ideology and subjective interests? And to what extent
are hegemonic scientific narratives pseudo-scientific?
Urmie Ray
(published 19th November 2022)
What is science?
There is a lot of confusion about what science is and what it is not. So let us start by discussing the actual definition.
Understanding the reality we live in
To survive, we have to understand the physical reality in which we live. So from the beginning, our species, worldwide, has been trying to meet this challenge. To understand means being able to describe, to explain and to predict. A description tries to answer the question "how?", and this may take the form of a mathematical formulation, though it is far from having to. An explanation tries to answer the much more difficult question "why?" These three aspects are closely related: the quality of predictions depends on the quality of descriptions and explanations, and the better the descriptions, the better the explanations and predictions that can be built upon them.
The difference between a scientific approach and a religious approach essentially lies in the nature of the explanations. Explanations that appealed to the will of one god or another likely did not allow our ancestors much foresight, nor thereby much protection against the elements. By contrast, the increasing domestication of fire went hand in hand with a growing understanding of that phenomenon.
Thus, an approach to the world that we perceive through our senses and our minds has long been distinguished from other approaches. The knowledge obtained had to be communicable to others, so that they too could verify it, and in particular, be able to predict as systematically as possible.
I understand that some people are convinced of the power of prayer, but spiritual knowledge, even if it is based on experiences, remains personal and not communicable.
Thus, the approach that has gradually developed into what has been called science since the 19th century is based on two tools: observation and reasoning.
But as our earliest schools of natural philosophy realized long before the Christian era, the question of reliability is central to this approach. To what extent does our interpretation correspond to the given physical reality?
Reproducible observation
Limited observation may not be reliable. For example, the fact that all observed swans are white does not mean that all swans are white: there are black swans. Observation must therefore be reproducible at will, not least so that it can be verified.
Consequently, the field of scientific investigation is limited to describable characteristics, that is, those that can be compared. Absolute concepts, such as heat or height, are certainly part of our life experience. But they have no place in science. For there is no way to check how two different individuals perceive them. What can be communicated and agreed upon is that one object is hotter or colder, larger or smaller than another. In other words, only those attributes that can be compared to an external reference can be subject to scientific examination. Therefore, absolute concepts like God are off limits because they cannot be described by comparison.
This is also the problem with Newtonian physics. It is based on a notion of time and space that is absolute. Newton was aware of this. But it was the physicist Ernst Mach who first really understood the meaning of this deficiency at the end of the 19th century. It was based on his work that Albert Einstein developed his theory of relativity.
Let us add an important observation: it is not enough to know the intrinsic properties of a phenomenon. It is also necessary to have an idea of its interactions with the environment. Hence the need to study it both in isolation in a laboratory and in its natural environment, across time and space. Some effects and their significance may not be immediately visible.
Reasoning
On the other hand, if we remain tied to our limited immediate experience, we will have to collect facts all the time and will not be able to predict. Therefore, science aims to obtain unified descriptions and explanations of disparate phenomena.
Reasoning is what frees us from this process of constant trial and error. It too must be communicable. It must therefore rest on commonly agreed methods, which have varied over time and even from school to school.
Concepts
Finding explanations that unify various phenomena forces us to base our theories on concepts – forces, atoms, genes, etc. – which are not necessarily the same ones we are used to. We have no way of determining whether these are characteristics of our reality or constructions of the human mind to account for experiences common to all of us. They are scientifically indefinable except through the use of other concepts. However, and this is fundamental, the relevance of concepts must be confirmed by effects that can be perceived and described. They should be replaced by more appropriate concepts if observation and reasoning so require.
In summary
Science is the reasoned study, based on reproducible and sufficiently replicated observation, of those properties of the perceptible world that can be described by comparison, and of the perceptible effects attributed to the evolving, changing concepts thus deduced – a study that includes interactions with the natural environment.
We all have a scientific mind. The difference between a scientific approach and an everyday approach is one of degree, not of kind: science brings precision and systematization where they did not exist. For example, we all have an idea of the difference between fruits and vegetables, based on vague, sometimes cultural, reasons and comparisons. The scientific definition of a fruit is an organ that contains seeds, protects them during their development and helps them to disperse. Tomatoes and cucumbers are therefore fruits, contrary to the popular belief that they are vegetables.
In other words, science clarifies the similarities and differences between comparable objects, taking observation beyond the superficial. In this way, ambiguities are reduced.
It is essential to emphasize that our conclusions cannot be fanciful and inexplicable deductions. They cannot be considered scientific until they are supported by reasoned arguments. Satisfactory results may be pure coincidence. On the other hand, reasoning that is not based on solid empirical evidence is not science.
Mathematics
This is why mathematics is not a science. Science arose from the human ability to make sense of an otherwise chaotic reality by attributing patterns to it. Mathematics was born from the study of patterns that are quantifiable.
In order to master the interrelated notions of quantity and space, our ancestors developed the concepts of integers and geometric objects, excluding the particular nature of the objects in question and maintaining only their quantity or shape. They must have realized that if our only interest is the quantity and not the other qualities of the objects in question, then there is no difference between two fingers and two equal lengths, but that these differ from five fingers. In other words, numbers arose from our recognition of patterns relating to quantity, making our most basic reasoning tool, comparison, as effective as possible.
Without going into more detailed discussion, I will say that mathematics is the logical study of relationships between abstract concepts, based on the notion of numbers.
Weaknesses intrinsic to science
These characteristics of science make our scientific knowledge very precarious.
Simplification and approximation
Even the most holistic approach is a simplification. The human mind is incapable of encompassing the entirety of an unfathomably complex nature. All our deductions, all our observations, all our measurements are only an approximation of reality.
These problems are exacerbated in mathematical theories. A hypothesis must first be expressed in a common language. The process of translation into mathematical symbolism is accompanied by a great loss of information. It eliminates everything that is not quantifiable from the start. Therefore, the further we get from the inanimate world, the less appropriate a mathematical description becomes. Even among quantitative characteristics, a choice must be made. Mathematics can only deal with a very limited number of parameters, and only a very simplified version of their relationships. Thus, a mathematical model reflects reality only very imperfectly.
The process of approximation goes even further. Although equations have exact solutions in theory, in all but the simplest cases we can only solve them approximately. This is generally the case with differential equations, that is, equations that describe the evolution of a system in time and space and are therefore the basis of predictions. A whole new series of approximations occurs when translating our mathematical theory back into everyday language, i.e. when applying it to concrete reality, especially since it is likely to involve non-exact numbers such as √2 or π.
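The gap between exact and approximate solutions can be seen even for the simplest differential equation, dx/dt = x, whose exact solution e^t is known in closed form. Here is a minimal sketch of the forward Euler method, the most elementary approximation scheme; all numbers are illustrative and not from the text:

```python
import math

# Forward Euler method for dx/dt = x with x(0) = 1, whose exact
# solution is x(t) = e^t. Even this trivial equation can only be
# solved step by step, approximately; the step size controls, but
# never removes, the error.

def euler(f, x0, t_end, steps):
    """Approximate x(t_end) for dx/dt = f(x) by forward Euler steps."""
    x, dt = x0, t_end / steps
    for _ in range(steps):
        x += dt * f(x)
    return x

exact = math.e                              # x(1) = e ≈ 2.71828
coarse = euler(lambda x: x, 1.0, 1.0, 10)   # (1.1)**10 ≈ 2.5937
fine = euler(lambda x: x, 1.0, 1.0, 10000)  # much closer, still not exact
print(exact - coarse > exact - fine > 0)    # True: refining shrinks the error
```

Refining the step size shrinks the error, but no finite step size eliminates it; the numerical solution always remains an approximation of the exact one.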
Furthermore, the mathematical part can, as in quantum physics, have more than one scientific interpretation.
In conclusion, the perfect precision inherent to mathematical formalism allows us greater control over certain quantifiable characteristics, but precisely because of this precision, it is very far from reality.
To quote Einstein, “To the extent that the propositions of mathematics refer to reality, they are uncertain; and to the extent that they are precise, they do not refer to reality.”
Unpredictability
It is not surprising that unpredictability follows even in the simplest deterministic theory:
Consider the following example constructed by the physicist Max Born. A particle moves without friction along a straight line of length l between two walls. When it reaches either end of the line, it bounces back. Suppose that its initial position is given by the point x0 on the line, that its initial velocity is v0, and that the inaccuracies of our initial measurements are Δx0 and Δv0. According to Newton's first law, at an instant t it must be at the point x = x0 + t·v0. However, by the same law, our prediction of its position at time t will deviate from this value by Δx = Δx0 + t·Δv0. So our error keeps increasing over time. After a critical time tc = l/Δv0, this deviation becomes greater than the length l of the line. In other words, for any time t > tc, we cannot predict the position of the particle at all: it can be anywhere on the line.
We can improve our measuring instruments and reduce initial inaccuracy, but we can never completely get rid of it. All we will do is extend the time range in which prediction is possible.
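Born's estimate is easy to check numerically. A minimal sketch with illustrative values for l, Δx0 and Δv0 (none of them from the text):

```python
# Growth of the prediction error in Born's bouncing-particle example.
# The error at time t is dx(t) = dx0 + t*dv0; prediction fails once
# dx(t) exceeds the length l of the line.

l = 1.0      # length of the line (illustrative value)
dx0 = 1e-6   # inaccuracy of the initial position measurement
dv0 = 1e-6   # inaccuracy of the initial velocity measurement

def position_error(t):
    """Uncertainty of the predicted position at time t."""
    return dx0 + t * dv0

tc = l / dv0                   # critical time, here ~1e6 time units
print(position_error(tc) > l)  # True: the particle could be anywhere

# A tenfold better velocity measurement only postpones the failure:
print(l / (dv0 / 10) > tc)     # True: tc grows tenfold, but stays finite
```

Shrinking Δv0 pushes the critical time tc further out, but as long as the measurement error is non-zero, tc remains finite and the prediction eventually fails.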
This example concerns a simple, idealized closed system. In the real world, countless factors are involved, worsening unpredictability. Fundamentally, because of inevitable errors, our ability to know what is happening beyond a certain time may be limited to an extent that no amount of technical progress can overcome.
In our computerized calculations, small errors can propagate and grow. This is because the way a computer encodes numbers and carries out its internal calculations involves rounding errors. Further errors occur when the encoded result is translated back into the form printed on the screen.
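A classic illustration of such rounding drift, offered here only as a sketch: the decimal number 0.1 has no exact binary representation, so even simple repeated addition accumulates error.

```python
# 0.1 cannot be represented exactly in binary floating point, so each
# addition below carries a tiny rounding error; over many steps the
# errors accumulate and the total drifts away from the true value.

total = 0.0
for _ in range(1000):
    total += 0.1

print(total == 100.0)       # False: the total is not exactly 100
print(abs(total - 100.0))   # a tiny but non-zero residue
```

And, as the text notes, printing the result back in decimal involves a second conversion, so the number displayed on screen can itself differ from what the machine holds internally.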
Observation in the computer age
Computerization also adds new questions to the act of observation. It has been known since before the Christian era that observation, as a result of a complex collaboration between our senses and our mind, is far from neutral and can be misleading.
Since then, observational instruments have introduced a whole series of new complications, despite the unsuspected possibilities they have opened up. Besides introducing errors, studying events of our four-dimensional space-time through one- or two-dimensional symbolic representations raises the issue of information loss. Most importantly, computers consist of algorithmic processes represented by 0's and 1's, and are therefore severely limited by oversimplified assumptions. They cannot go beyond them; they cannot infer. We may therefore wonder whether they can detect only what fits our preconceptions.
In fact, the problem worsens as the observation process becomes increasingly automated, eliminating the human observer: the machine both observes and interprets. It is even worse when observation is dropped altogether and conclusions are based on simulations rather than real experiments, as is increasingly the case. These problems raise many questions about our knowledge of the microscopic world, which depends entirely on instruments: we have virtually no unfiltered representation against which to compare the image they give us. Furthermore, in order to be observed, samples are often not only removed from their environment, but also have to be prepared, for example using a staining technique. Some adulteration then occurs.
Generalization
All of this calls into question the process of generalization, that is, deducing principles from data that can only be limited. The problem of generalization is even more serious because the observation can be replicated, but it will never be the same. So how similar must the results be to be accepted as justification for a given conclusion? The question arises all the more because we are not simply trying to deduce the color of swans from repeated observations, but to deduce basic principles from observations of a wide variety of different cases. Too little data can lead to wrong models and therefore wrong predictions.
The greater the number of parameters, the greater the sensitivity of the results to the initial conditions, and the less we can expect the results of our experiments to agree closely. Furthermore, results may depend on the interpretation and protocol applied. Obtaining consistent results can therefore be difficult. So how many times must an experiment be replicated before its results can be accepted?
Basically, the question of when experimental verification can be considered satisfactory has no clear answer. It cannot necessarily be said that it should depend on the success of its applications, as its drawbacks may take some time to be noticed. Even when a hypothesis is developed in the best scientific spirit, serious flaws can remain unidentified for decades, precisely because our observation remains limited, if only for technical reasons.
When is it reasonable to apply a hypothesis, that is, to construct new hypotheses based on it or to use it technologically?
Hypotheses
There can be no science without hypotheses. We must first have established a relationship with the universe before we can even think scientifically. In other words, metaphysics always precedes science. More generally, science remains based on assumptions that are forgotten because they are hidden and have become too familiar. These can strongly influence the theories we develop.
For example, mathematical predictions imply integration. Behind this concept is the assumption of uniformity, according to which processes would remain the same across time and space. This assumption is the basis for all generalizations. For the Buddha, uniformity was assumed to be very limited. It was Democritus who introduced its most extreme version as a basic scientific principle. Galileo remained cautious. It was reaffirmed first by physicists in the 17th century and then by geologists, for whom the rates of geological processes remained the same over time.
However, due to unpredictability, we have no idea how far uniformity actually holds. It is therefore better to be cautious with distant phenomena.
Furthermore, uniformity over time has been challenged by geological discoveries since the 1960s that suggest that unique cataclysms have critically altered existing conditions in our planet's history.
The limits of science
For all these reasons, although it is the least fanciful form of knowledge, we cannot know whether science can lead us to truths. Our scientific understanding is constantly being deepened. Therefore, it keeps us away from untruths. In fact, science cannot consciously tell us untruths. At all times it must comply with all known data. We improve our approximations, of course. But, in the infinity of the world, does this bring us closer to any truth?
Doubt is therefore characteristic of a scientific approach. Science challenges conventional wisdom. The importance of doubt has been emphasized by scientific thinkers of all times and traditions. Theories should not be rejected, but their acceptance should not be passive.
To proceed scientifically is to recognize that science is a “philosophy of nature”, even if it is different from other philosophies in that it “questions nature itself to obtain answers to what nature is”. To proceed scientifically is to hope that our scientific thoughts are in harmony with nature, as otherwise we would be incompatible with the given conditions of life, but it is also to recognize that science is far from being objective. It always presupposes the existence of man, and we must realize that we are not mere observers, but also actors on the stage of life.
From science to dogma
Staying on the scientific path requires caution; it is easy to stray from it.
However, sincere error should not be confused with dogmatism. It is through error that we advance, all the more so because in each era and each culture science is influenced by existing thought and observation techniques. It is thus an anachronistic reading of Newtonian physics to apply our current understanding to it and declare it false: it remains satisfactory for many everyday phenomena, as long as the speeds involved are well below the speed of light.
That said, the nature of science has been highly appreciated for millennia, for example by certain schools of thought in ancient India. It was also the subject of heated discussions at the turn of the 20th century, when issues of positivism became clear. Thus, the deformation of modern science into dogma is facilitated by its intrinsic weaknesses, but to understand it, it must be placed in the economic context.
In the 19th century, market capitalism was transformed into financial capitalism. The profit-maximizing perspective that was gradually established required ceaseless material growth and thus increasingly efficient production.
As a result, technology must rely on advanced research to increase efficiency, resulting in constant and increasing changes to our environment. It is becoming less and less suitable for human life. The fact that something appears temporarily viable does not guarantee its compatibility with maintaining the conditions necessary for human life in the medium and long term, or even in the short term: health and environmental problems quickly followed and continued to increase. A stage was then reached where, in order to stay the course, research increasingly lost its scientific nature and began to betray science itself.
In other words, science has been turned into pseudoscience: a set of principles that claims to be science but lacks its characteristics – in particular, it is not based on reasoning grounded in reproduced and reproducible observation. It is, therefore, a belief.
Current research may too often be described as such. The extent of the abuse is difficult to measure because a basic condition – transparency, without which there can be no science, since conclusions otherwise remain unverifiable – is now commonly ignored under the pretext of competition or state secrecy.
According to Richard Horton, editor of the prestigious Lancet, "much of the scientific literature, perhaps half, may simply be untrue. Afflicted by studies with small sample sizes, tiny effects, invalid exploratory analyses, and flagrant conflicts of interest, together with an obsession for pursuing fashionable trends of dubious importance, science has taken a turn towards darkness."
Unfounded conclusions
Let's look at two examples.
1) From experiments with electromagnetic radiation (EMR) "in the visible, ultraviolet and X-ray bands", that is, with "frequencies above the lower limit of the infrared", it was extrapolated on purely mathematical and theoretical grounds that all EMR is quantized, that is, that it consists of photons. It was only in 2015 that this claim was put to experimental test and found to be wrong for EMR below the lower infrared limit, which includes all the EMR from our antennas – one reason why this radiation is harmful to human health.
2) The viral thesis is also a hypothesis. Unlike the claim that EMR from antennas is quantized, it has not been proven false. But no particle has ever been observed first to be in the air, then to enter the body and become the source of a disease. The virus therefore remains a concept. Is it a useful hypothesis? Perhaps unicorns or ghosts are useful hypotheses to explain certain phenomena. But a scientific conclusion must be based on reproducible observation, and this is certainly not the case here. We can thus develop the concept of an antivirus, such as a vaccine; but one cannot materially manufacture a vaccine against something that has not been proven to exist, and one cannot make policy on assumptions that remain unverified to this day. The debate over the relevance of a particular hypothesis must remain internal to the scientific world; this is how our understanding evolves.
Financial temptations
Research is today fully embedded in market capitalism, which it made possible and continues to make possible. Financial gain has become a primary motive in a culture where more and more researchers are creating private companies themselves to financially exploit their results.
As a result of deliberate policies shifting research funding from public to private bodies, many members of the hierarchy of the new Church of Scientism are personal beneficiaries of the largesse of various interest groups. Corruption is now endemic, and conflicts of interest seriously undermine research activities. Only conflicts directly related to a particular piece of work need be disclosed, i.e. any direct funding that could influence its conclusions. This obligation is easily circumvented: favours can take many forms, from appointments as consultants to membership of company boards. When the generous donors to universities, research laboratories and scientific societies include some of the most powerful multinational conglomerates, can any work done within their walls be truly disinterested?
From knowledge to production
This corruption goes hand in hand with a transformation, since the turn of the 20th century, of the objectives of science to meet the demands of financial capitalism: from understanding, the objective became production. The early 20th century saw the emergence of the researcher-technologist, first in chemistry, then in physics, and now in biology.
This subjugation of research to the economic ideal is maintained in particular by a culture of awards. This was initiated by a leading industrialist in the emerging military-industrial complex, Alfred Nobel, precisely at a time when the control of research became essential. This culture helps bring forth individuals and subjects that are dedicated to this ideal. It isn't easy to recognize the strong subjectivity underlying the decision-making process here because, unlike science, the new creed of pseudoscience professes a valueless objectivity.
This does not mean that exceptional work is never properly recognized, but it is preferable that it contribute to the maintenance of economic objectives. Einstein won the Nobel Prize only for his work on the photoelectric effect.
But the awards have contributed to the rise of pseudoscience, since this is what can sustain production.
For example, one of the first Nobel Prizes (in chemistry) was awarded to Fritz Haber for the synthesis of ammonia. However, the artificial method of producing such molecules does not reproduce the natural process, and their geometry therefore differs from that of their natural counterparts. The correct scientific approach would have been to study their impact on the environment and human health.
Marie Curie received the prize twice, so by the prize's own criteria one would have to conclude that her work is more important than Einstein's. It is certainly more important from a profit perspective. Her objective was circular: to study the properties of radioactivity in order to keep increasing its production. Out of all the tragedies associated with this work came the development of radiotherapy as a treatment for cancer. Here again we find a circular pattern: radiation applied to alleviate a disease that was relatively rare compared with others, and whose prevalence radiation itself helped to increase.
Increasingly, research has become a matter of colossal machines requiring colossal funding in a few colossal locations. It thus rests on one-off experiments that cannot be replicated at will, not least because of the infrastructure required. This over-reliance on technology makes us forget that processes created artificially in laboratories may very well not correspond to their real-life equivalents.
For example, in the 1950s it was shown under laboratory conditions that organic matter could emerge from what may roughly be described as a methane soup. Because of this success, it has been forgotten that this does not imply that this is how it actually happened. And indeed, the first experimental study to reconstruct the early atmosphere from real empirical evidence, carried out in 2011, indicates that, on the contrary, it may not have been as poor in oxygen as previously thought.
An inversion of the relationship between mathematics and science
The gradual takeover of science by pseudoscience is reflected in the gradual inversion of the relationship between science and mathematics.
With the growing importance of industrial production, mathematics acquired greater primacy within science because it is through the measurable that science can be converted into technology.
The first big step was the birth of computer science out of the demands of technology, when physics, mathematics and technology were amalgamated into a single field. This synthesis was certainly very constructive, but as it was carried out from a profit perspective, it also helped to sustain profit maximization. In this process, mathematics quietly took the driver's seat.
Mathematical applications depend on our purpose and are not limited by the need to conform to reality, unlike science. Thus, the cession of leadership to mathematics largely contributed to the emergence of pseudo-science. This inversion is also the consecration of the materialist perspective, since mathematics cannot take into account the factor of life.
The second step was the creation of bioengineering, one of the fastest growing sectors. Life came to be seen as a huge computer whose underlying programs can be transformed at will. Thus, the mechanistic view of nature was adapted to the new technological phase we have entered.
Mathematics, via computer science, is now taking this process to its final stage, in which intelligence is reduced to a measurable quantity and knowledge to flows of information – that is, to artificial intelligence – which in the end is meant to lead us to transhumanism: the total fusion of life and machine, with the machine in the driver's seat.
There is, however, a basic error with a machine-made virtuality that denies our given reality. Since “realities” are not “ghosts,” writer Charles Dickens warned that there was “a greater danger of them swallowing us up” sooner or later.
Science and Future
Therefore, the problem we face is the deformation of science into a pseudo-science responsible for man-made dangers. On the other hand, despite all its weaknesses, an approach based on observation and reason is certainly the most appropriate for the study of perceptible reality. To reject science is to renounce the wonderful possibility of unraveling some of nature's mysteries, even if only superficially, even if we always end up discovering that our previous conclusions were not entirely correct. To reject science is to reject our main survival tool.
Education
It is therefore essential first to distinguish science from its deformation. To do this, we need to develop some appreciation not only of the technique of science, but also of its nature. Science is not a matter for experts alone. The amateur must claim the right not only to understand, but also to judge by his own lights. Everyone is able to understand the ideas behind the technical part. The best way to learn how is to read the works of pioneering scientific minds: who better to explain the how and why of the ideas they helped to develop?
However, the only real way to dispel the confusion between science and pseudo-science is to ensure that the education of future generations nourishes our innate scientific intuition. Assimilating the spirit of science means learning to think for oneself, based not on dogma but on an adequate assessment of the range of available information. This requires that instruction in technique be placed in the context of a discussion of the nature of science.
There are many ways to do this and not all are suitable for every student. This is why pluralism is essential in the type of education offered, both at school and at university.
Reduce the scope of harmful research
The ethical question
So far, the debate has focused on ethical issues. However, the problems have not been resolved. On the contrary, they are getting worse.
Ethics certainly influences science. Basing science on values that lead to a more peaceful future may seem like the best way forward. But is this really the case? What should these values be?
Ethical debates remain ineffective. On the other hand, restricting research within any ethical framework is harmful to science. Setting limits on the human mind erodes the creative dynamism essential to civilizations. Creativity takes unpredictable forms, therefore, it must be given free rein.
The debate should be moved to a less controversial level
Man-made dangers are the result of clearly unscientific research. The debate must therefore be about the scientific nature of the research. It is true that science cannot be expected to be defined precisely, or that sufficient consensus will be reached on a definition. However, it is possible to identify clearly what is not science: research contradicted by studies with a solid empirical basis, research based on observations that cannot be reproduced at will, or research whose conclusions rest on reasoning that does not correlate with the data provided. Excluding such research would greatly reduce, or even eliminate, controversial experiments.
For example, the field of medicine continues to be based on animal experimentation, despite the fact that this has been repeatedly denounced on ethical grounds for over a century. Today, as in the past, the answer is that human lives are saved as a result. However, due to biological dissimilarities between other species and ourselves, extrapolation to man is rarely justified and may even be harmful. In other words, these experiments are superfluous and science is certainly not about doing experiments just for the sake of doing them.
At the same time, transparency of all research must be legally required. This is far from the case today.
Research and money
That said, the raison d'être of pseudo-science is the pursuit of profit. The link between money and research must therefore be broken. This means, of course, that anyone with material interests must be prevented from exerting undue influence over research; anonymity, for instance, would prevent a donor from choosing the recipient. For the purpose of science has been turned into the making of profit. It goes further: scientific activity itself has been transformed into a wealth-generating activity, thanks to the development of the notions of intellectual property and patents. As a result, the value of scientific work now depends on the amount of money it generates. It is not that science stands above or below money; it is simply unrelated to it. The choice of a criterion so inappropriate to science has thus contributed to its distortion. It would therefore be useful to discuss the abolition of patent laws and how to achieve it.
Excessive specialization
Overspecialization impedes not only the proper study of the most fundamental and pressing issues, which often span multiple fields, but also the very identification of those issues.
One way to remedy this problem is to transform universities into small academic communities without subject barriers and to make academic studies less specialized. Naturally, the number of years of study would increase. But the current pace of training is a legacy of the mentality of recent centuries; it has lost its relevance now that life expectancy is longer and technologies mechanize many tasks, freeing us from them.
What science for the future?
As far as science itself is concerned, the question is what form it should take. The goal should be to reduce its weaknesses.
Maurits Escher's images show how little we are capable of grasping the complexity of reality. Even with two intertwined patterns, the human brain can observe only one at a time. In other words, any single perspective illuminates only certain aspects, and those aspects may themselves look different from different perspectives.
But every form of science is based on assumptions. Therefore, each of them may miss critical aspects. Science must therefore be restored in all its diversity.
Starting with a synthesis of the different forms that science has taken over the course of history could prove useful and lead to radically new ways of thinking. It would be foolish to reject wholesale the vast reservoir of knowledge already developed in different cultures at different times and to grope in the dark.
The relevance of ancient approaches in the modern context is underlined by the example of the mathematician Srinivasa Ramanujan: the results he obtained following a tradition dating back to the Vedic era have proven essential in the most sophisticated modern physics.
Naturally, proven methods should not be abandoned, but supplemented by others, taking into account the many changes in our perception of reality brought about by science itself.
In short, as in education, it is only through the return of true pluralism that we can attempt to overcome some of the gaps in human understanding.
Scientific applications
Only after a broad and deep theoretical understanding can we begin to think about technological applications. As Ralph and Mildred Buchsbaum proposed half a century ago, “the burden of proof…of the absence of significant harm to man” should be legally “placed on the man who wants to bring about some change.” Today, it is victims who must provide proof of actual harm. But it is unrealistic to rely on science to identify the exact cause of harm: science is generally unable to untangle the increasingly complex web of causes and pinpoint a culprit, and when it can, the process is long. Meanwhile, damage is done, sometimes irreversibly. Too often, reasonable doubts remain, leaving people at the mercy of legal judgments based on technicalities and on the opinions of those who render them.
Once the public has given its consent to a certain type of application, even more careful experiments must be carried out to ensure that side effects do not harm us. In other words, we need to give ourselves time before introducing new elements into nature: only experiments carefully conducted in natural environments, over sufficiently long periods, can help us distinguish applications whose main problem is excessive use, and which can therefore be used within certain limits, from applications that present other problems.
Conclusion
Let's return to the initial question: what is science if it is in constant flux?
It is the human mind's disorderly but heroic attempt to understand the workings of the universe by persisting in the face of insurmountable obstacles, in the face of elusive understanding, despite countless failures and errors. These errors, in turn, give rise to new questions that must be answered. Scientific knowledge is the closest thing to certainty, but it is unable to offer certainty because certainty is incompatible with our human condition.
Despite the dominance of pseudo-science, real science has continued to make its way. During the last two centuries, the science we have developed has undermined the belief in a manifest reality of material substances interacting according to mechanically rigid rules. From a reality of isolated substances, each thing began to be seen as part of a whole. This whole cannot be reduced to the sum of its parts. On the other hand, no part can be explained independently of the whole. And yet, each individual part has its own meaning and reflects the whole in different ways. In short, our scientific understanding increasingly takes into account the complexity of our reality.
It is up to us to restore science to its rightful place in a supportive environment where scientists are finally free to focus on constructive topics of their choosing. Today, many must waste their talents and efforts combating lies crafted and propagated in the name of science.
In such a context, industrial activity, too, need not be harmful but could instead benefit humanity.
Source: https://strategika.fr/2022/09/25/quand-la-science-devient-pseudo-science-urmi-ray/