Analysis of information, propaganda, and elements of programming

thorbiorn

The Living Force
FOTCM Member
Recently I saw the word propaganda appear in the thread Yugoslavia - What Really Happened. Before deciding to make a thread about the topic, I searched for it on the Forum, both in titles and everywhere, but an everywhere search on its own gives a limited overview because there are so many results. Here are the search results for many of the topics on the Forum:

Cassiopaean sandbox
What's on your mind see search
Movies & Trivia: Picks & Pans see search
Music see search
Books see search
Baked Noodles see search

Science
Outer Space and "Inner Space" Sciences see search
Psychology & Cognitive Science see search
Earth Changes see search
Linguistics see search (A few)
Environmental Issues see search
Diet and Health see search

Esoterica
The Work see search
Memories, Dreams, Reflections see search
The Cassiopaean Experiment see search

The Real World
Our Orwellian World see search
History see search
Religion see search
Geopolitics see search
Conspiracy Theory see search
COINTELPRO and Disinformation see search
911 see search

SOTT
SOTT News Discussion see search
Sott Talk Radio Show! see search
Suggest an Article for SOTT - and Discuss see search

Not to forget SOTT as a resource, below are search links to articles that mention propaganda somewhere:
Title: 1587 articles
Summary: 3963 articles
Text: 12062 articles
Comment: 2769 articles
Title and summary: 562 articles
Title, summary, and text: 390 articles

There is a lot of research on propaganda and the techniques it uses, in part probably because the designers of propaganda also need to understand how it is done, what works, and for whom.

Those undertaking to study communication and propaganda may find their examples for analysis on the right, the left, or the center of the political spectrum. Therefore, when reading examples of propaganda analysis, some may have an interest in focusing on the signs of propaganda from what they perceive as the other side, while being uninterested in the side they are more connected to or sympathize with. As an example, while some fact-checking and misinformation sites might have a mission to denigrate an opponent, I don't think all the ideas for fact-checking such sites advertise or use are inherently useless. Much depends on how they are used and from what perspective and knowledge background. Some people think that all propaganda is inherently bad, but I disagree. A slogan, for instance, can be based on a lie. An example from COVID times could be: "Get vaccinated, protect your loved ones." But it can also be based in truth, or be helpful, as when teachers wish to drill into students the elements of hygiene: "Hygiene is in your hand", road safety: “Stay Alert, Stay Alive”, or first aid: "Safety brings first aid to the uninjured".

If one is aware of how propaganda works and can identify examples, it might be easier to resist the intentions behind it, especially if it is nefarious or not in one's best interest. It might also assist in understanding how other people react to the information they take in, or as a result of it.
 
A little about propaganda techniques

A page that calls itself the Propaganda Analysis Project has a copy of the article Fine-Grained Analysis of Propaganda in News Articles [but find also a copy here], written by a team of authors: Giovanni Da San Martino, Seunghak Yu, Alberto Barrón-Cedeño, Rostislav Petrov, and Preslav Nakov. It reads as if they aim to develop a program that can detect propaganda. Here are excerpts from their paper with a few comments.
Propaganda aims at influencing people’s mindset with the purpose of advancing a specific agenda. Previous work has addressed propaganda detection at the document level, typically labelling all articles from a propagandistic news outlet as propaganda. Such noisy gold labels inevitably affect the quality of any learning system trained on them. A further issue with most existing systems is the lack of explainability. To overcome these limitations, we propose a novel task: performing fine-grained analysis of texts by detecting all fragments that contain propaganda techniques as well as their type. In particular, we create a corpus of news articles manually annotated at the fragment level with eighteen propaganda techniques and we propose a suitable evaluation measure. [...]
[...]
it has been observed that propagandistic sources could post objective non-propagandistic articles periodically to increase their credibility (Horne et al., 2018). Similarly, media generally recognized as objective might occasionally post articles that promote a particular editorial agenda and are thus propagandistic. Thus, it is clear that transferring the label of the news outlet to each of its articles, could introduce noise. [...]
It is true that sites that often post loaded articles can also post articles that have fewer elements characteristic of propaganda and which are largely true. Even a propaganda rag may publish something of value.

The article goes on to introduce propaganda and its techniques:
Propaganda comes in many forms, but it can be recognized by its persuasive function, sizable target audience, the representation of a specific group’s agenda, and the use of faulty reasoning and/or emotional appeals (Miller, 1939). Since propaganda is conveyed through the use of a number of techniques, their detection allows for a deeper analysis at the paragraph and the sentence level that goes beyond a single document-level judgment on whether a text is propagandistic. Whereas the definition of propaganda is widely accepted in the literature, the set of propaganda techniques differs between scholars (Torok, 2015). For instance, Miller (1939) considers seven techniques, whereas Weston (2018) lists at least 24, and Wikipedia discusses 69. The differences are mainly due to some authors ignoring some techniques, or using definitions that subsume the definition used by other authors. Below, we describe the propaganda techniques we consider: a curated list of eighteen items derived from the aforementioned studies. The list only includes techniques that can be found in journalistic articles and can be judged intrinsically, without the need to retrieve supporting information from external resources. For example, we do not include techniques such as card stacking (Jowett and O’Donnell, 2012, page 237), since it would require comparing against external sources of information.
They list and explain the different categories of propaganda techniques they use, with examples, but one can also find them, with definitions and slight variations, in this list of 20, which must have been written by one or more of the same authors.
Keeping to the list of 18, adding explanations where they seem most needed, and consulting the explanations in both lists, there is:
1) Loaded language
Using specific words and phrases with strong emotional implications (either positive or negative) to influence an audience.
2) Name calling and labelling
3) Repetition
4) Exaggeration and minimization
5) Doubt
6) Appeal to fear/prejudice
7) Flag-waving
Playing on strong national feeling (or to any group; e.g., race, gender, political preference) to justify or promote an action or idea
Example 1: "patriotism mean no questions" (this is also a slogan)
Example 2: "entering this war will make us have a better future in our country."
8) Causal oversimplification
Assuming a single cause or reason when there are actually multiple causes for an issue.
9) Slogans
10) Appeal to authority
11) Black-and-white fallacy, dictatorship. The relation to dictatorship is explained:
Presenting two alternative options as the only possibilities, when in fact more possibilities exist (Torok, 2015). As an extreme case, telling the audience exactly what actions to take, eliminating any other possible choice (dictatorship).
12) Thought-terminating cliché
Words or phrases that discourage critical thought and meaningful discussion about a given topic. They are typically short, generic sentences that offer seemingly simple answers to complex questions or that distract attention away from other lines of thought (Hunter, 2015, p. 78).
Examples: It is what it is; It's just common sense; You gotta do what you gotta do; Nothing is permanent except change; Better late than never; Mind your own business; Nobody's perfect; It doesn't matter; You can't change human nature.
13) Whataboutism
Discredit an opponent’s position by charging them with hypocrisy without directly disproving their argument (Richter, 2017). For example, mentioning an event that discredits the opponent: “What about . . . ?” (Richter, 2017).
14) Reductio ad Hitlerum. This term is not clear on its own, but it means:
Persuading an audience to disapprove an action or idea by suggesting that the idea is popular with groups hated in contempt by the target audience. It can refer to any person or concept with a negative connotation.
Example 1: "Do you know who else was doing that ? Hitler!"
Example 2: "Only one kind of person can think in that way: a communist."
15) Red herring, presenting irrelevant data
Introducing irrelevant material to the issue being discussed, so that everyone's attention is diverted away from the points made.
16) Bandwagon
Attempting to persuade the target audience to join in and take the course of action because “everyone else is taking the same action” (Hobbs and Mcgee, 2008).
17) Obfuscation, intentional vagueness, confusion
18) Straw man, misrepresentation of someone's position
When an opponent’s proposition is substituted with a similar one which is then refuted in place of the original (Walton, 1996). Weston (2018, p. 78) specifies the characteristics of the substituted proposition: “caricaturing an opposing view so that it is easy to refute.”
For more details, see the source links or try the Wiki for Propaganda techniques.
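To make the idea of fragment-level detection concrete, here is a toy sketch in Python. The phrase lists and function names are invented for illustration only; the paper's actual system is a trained neural model, not a keyword matcher.

```python
import re

# Toy cue lists, invented for illustration -- real systems learn these
# patterns from annotated data rather than matching fixed phrases.
TECHNIQUE_CUES = {
    "loaded_language": ["disastrous", "heroic", "outrageous"],
    "thought_terminating_cliche": ["it is what it is", "it's just common sense"],
    "bandwagon": ["everybody knows", "everyone agrees"],
}

def detect_fragments(text):
    """Return (start, end, technique) spans for every cue found in text."""
    spans = []
    lowered = text.lower()
    for technique, cues in TECHNIQUE_CUES.items():
        for cue in cues:
            for match in re.finditer(re.escape(cue), lowered):
                spans.append((match.start(), match.end(), technique))
    return sorted(spans)

text = "Everybody knows the plan was disastrous, but it is what it is."
for start, end, technique in detect_fragments(text):
    print(f"{technique}: {text[start:end]!r} at [{start}, {end})")
```

Even this crude version shows the difference from document-level labeling: the output is a set of character spans, each tagged with a technique, rather than a single verdict on the whole article.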

In the paper, they try to pin down the instances of propaganda techniques in the articles they analyzed, down to the number of characters used. The following table shows the frequency and the length of the expressions that provided what they counted as evidence.
2024-07-18 180519.png
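Since the annotations are delimited at the character level, scoring a system means giving partial credit for partially overlapping spans. Here is a simplified sketch of character-weighted precision and recall; the paper's official measure also requires the technique labels to match and normalizes differently, and the function names are mine.

```python
def char_overlap(a, b):
    """Number of characters shared by two (start, end) spans."""
    return max(0, min(a[1], b[1]) - max(a[0], b[0]))

def span_precision_recall(predicted, gold):
    """Character-weighted precision/recall over annotated spans.

    Simplified sketch: each predicted span is credited by its best
    overlap with any gold span, and vice versa for recall.
    """
    if not predicted or not gold:
        return 0.0, 0.0
    prec = sum(max(char_overlap(p, g) for g in gold) / (p[1] - p[0])
               for p in predicted) / len(predicted)
    rec = sum(max(char_overlap(g, p) for p in predicted) / (g[1] - g[0])
              for g in gold) / len(gold)
    return prec, rec

gold = [(10, 20)]        # one annotated fragment of 10 characters
predicted = [(15, 25)]   # prediction overlaps half of it
print(span_precision_recall(predicted, gold))  # → (0.5, 0.5)
```

The point of such a measure is that a system which finds roughly the right place gets partial credit, instead of being scored as a complete miss.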

Looking for instances of propaganda techniques in articles and finding them can probably be a valuable tool, but what a focus on articles alone might miss is the tendency of a publication and the synergy created by the juxtaposition and presentation of different articles on the same page or media outlet. Now we can also add the filtering done by fact-checkers and information controllers. What complicates how a text works for a receiver are also the cognitive biases. More on this another time.
 
Before continuing, a dive into the world of fact-checkers

This is a slight detour, but it is necessary.

In the previous post, I quoted excerpts from Fine-Grained Analysis of Propaganda in News Articles. That article used a selection of articles evaluated by the company Media Bias/Fact Check.
On their about page they say:
About Media Bias / Fact Check
Founded in 2015, Media Bias/Fact Check (MBFC) is an independent website that has promoted awareness of media bias and misinformation by rating the bias, factual accuracy, and credibility of media sources, large and small. Media Bias/Fact Check relies on human evaluators to determine the bias of media sources and the level of overall factual reporting through a combination of objective measures and subjective analysis using our stated methodology.

Dave Van Zandt is the founder and primary editor for sources. He is assisted by a collective of volunteers and paid contractors who provide research for many sources listed on these pages. Finally, MBFC also provides occasional fact-checks and original articles on media bias and publishes daily curated fact-checks from around the world.
Credibility
The credibility of a website/media source is not determined by who owns them but rather by its track record. Everybody starts as a beginner and, through experience, becomes an authority in their field. MBFC is no different. Over the last 8 years, we have proven to be a trusted authority on the rating of bias and the credibility of media sources. For example, MBFC is trusted by major media outlets and IFCN fact-checkers. This is evidenced by frequently being referenced by sources such as USA Today, Reuters Fact Check, Science Feedback, Washington Post, and NPR, among dozens of others. We are also frequently used as a resource in libraries, high schools, and universities across the United States.

Media Bias/Fact Check has also been used as a resource for research by the University of Michigan and the Massachusetts Institute of Technology. Further, we have been utilized by numerous print books such as these:
Finally, MBFC scored a perfect 100/100 rating by Newsguard, which rates the credibility of Media Sources. We believe it is significant that a competitor gave us this score.
IFCN was mentioned, but what is IFCN? The acronym stands for International Fact-Checking Network.
They say about themselves:
What is the International Fact-Checking Network?
The International Fact-Checking Network (IFCN) at Poynter was launched in 2015 to bring together the growing community of fact-checkers around the world. The network advocates for information integrity in the global fight against misinformation and supports fact-checkers through networking, capacity building and collaboration. IFCN’s network reaches more than 170 fact-checking organizations worldwide through advocacy, training and global events. Our team monitors trends in the fact-checking field to offer resources to fact-checkers, contribute to public discourse and provide support for new projects and initiatives that advance accountability in journalism.

We believe truth and transparency can help people be better informed and equipped to navigate harmful misinformation.
And their about page explains further:
The International Fact-Checking Network (IFCN) at Poynter has been a champion for the best practices and exchanges of fact-based information since it was established in 2015. Today, the IFCN is a global leader in fact-checking excellence, providing more than 100 operations around the world with:
Opportunities to connect with fact-checkers from countries around the world at events like IFCN Talks, Global Fact and International Fact-Checking Day.
The Wiki List of fact-checking websites makes it clearer how the IFCN began:
International Fact-Checking Network
International Fact-Checking Network launched in 2015 by the Poynter Institute set a code of ethics for fact-checking organizations. The IFCN reviews fact-checkers for compliance with its code, and issues a certification to publishers who pass the audit. The certification lasts for one year, and fact-checkers must be re-examined annually to retain their certifications.[2] IFCN lists 170 organizations as members as of July 2024.[3] Facebook and Instagram have used the IFCN's certification to vet publishers for fact-checking contracts.[4][5]
About the Poynter Institute, the Wiki has:
The Poynter Institute for Media Studies is a non-profit journalism school and research organization in St. Petersburg, Florida, United States. The school is the owner of the Tampa Bay Times newspaper and the International Fact-Checking Network.[2][3] It also operates PolitiFact.[4]
Next is an example of what the authors of the article from the last post warned about: the general labeling of a site rather than analysis of its individual articles.
Poynter published a list of over 515 news websites that it labeled "unreliable" in 2019. The author of the piece used various fake news databases (including those curated by the Annenberg Public Policy Center, Merrimack College, PolitiFact, and Snopes) to compile the list and called on advertisers to "blacklist" the included sites. The list included conservative news websites such as the Washington Examiner, The Washington Free Beacon, and The Daily Signal as well as conspiracy outfits including InfoWars.[17] After backlash from both readers of and contributors to some of the included publications, Poynter retracted the list, citing "weaknesses in the methodology".[18] Poynter issued a statement, saying: "[w]e regret that we failed to ensure that the data was rigorous before publication, and apologize for the confusion and agitation caused by its publication."[19] Reason pointed out that the author was a freelancer hired by the Institute who typically works for the Southern Poverty Law Center (SPLC). Reason drew parallels between the accuracy of the list with SPLC's own work on hate groups.[17]

Election integrity and COVID-19
In 2020, after receiving funding from Facebook, the Poynter Institute expanded the MediaWise program with a national media literacy program called MediaWise Voter project (#MVP). Its goal was to reach 2 million American first-time voter college students, helping them to be better prepared and informed for the 2020 elections.[20][non-primary source needed]

The Poynter Institute received $737,400 in federal loans from the Paycheck Protection Program during the COVID-19 pandemic. President Neil Brown noted that this was not the first time the institute received government funding, noting past training contracts with Voice of America.[21]
Speaking of COVID, the IFCN has this image on their website:
2024-07-19 165837.png
Following the link leads to a page that has this introduction:
Fighting the Infodemic: The #CoronaVirusFacts Alliance
Led by the International Fact-Checking Network (IFCN) at the Poynter Institute, the #CoronaVirusFacts united more than 100 fact checkers around the world in publishing, sharing, and translating facts surrounding the COVID-19 pandemic. The Alliance was launched in January 2020 when the spread of the virus was restricted to China but was already causing rampant misinformation globally. The World Health Organization classifies this as an infodemic — and the Alliance was on the front lines in the fight against it. The project concluded in early 2023.
Their fact-checking prowess stopped in early 2023, and they probably did not go back to recheck their pronouncements.
To give an impression of how they worked, here are some headlines in the following screenshot from this page:
2024-07-19 170752.png
Some of the checks these groups and companies made led to the bans and FB prisons many are familiar with. What they achieved was to stifle the debate about the wisdom of implementing the COVID policies. Sure, many claims were initially anecdotal, some wrong, but to expect normal people to come up with peer-reviewed double-blind testing before they should be allowed to utter their suspicions is in my opinion just plain wrong.

Most recently, there is this on their page:
2024-07-19 171021.png
Scrolling down on this page, one finds at the time of posting:
2024-07-19 171854.png
Incidentally, and only to comment on articles one and three, it just so happens that Georgia is where the EU, the US, and NATO have been working for years. Readers of SOTT will know the background for what is going on, but to include some of the headlines for the sake of context, here is:
US wants regime change in Georgia - Russian intel which has excerpts from other articles and also the following additional headlines:
As to the second article about populism, it reads like an echo of what Ursula von der Leyen and Emmanuel Macron have expressed. They are not entirely wrong that their opponents, whom they call populists, are gaining ground, but this has little to do with democracy and more to do with preserving their power.

For more comment on the fact-checking industry, consider the points that Mike Benz makes in this SOTT Focus: The End of Democracy: "What I'm Describing is Military Rule"
You couldn't get a story killed. You couldn't have this favors for favors relationship. You couldn't promise access to some random person with 700,000 followers who's got an opinion on Syrian gas. And so this induced, and this was not a problem for the initial period of social media from 2006 to 2014 because there were never dissident groups that were big enough to be able to have a mature enough ecosystem on their own. And all of the victories on social media had gone in the way of where the money was, which was from the State Department and the Defense Department and the intelligence services. But then as that maturity happened, you now had this situation after the 2016 election where they said, okay, now the entire international order might come undone. 70 years of unified foreign policy from Truman until Trump are now about to be broken.

And we need the same analog control systems. We had to be able to put bumper cars on bad stories or bad political movements through legacy media relationships and contacts we now need to establish and consolidate within the social media companies. And the initial predicate for that was Russiagate. But then after Russiagate died and they used a simple democracy promotion predicate, then it gave rise to this multi-billion dollar censorship industry that joins together the military industrial complex, the government, the private sector, the civil society organizations, and then this vast cobweb of media allies and professional fact checker groups that serve as this sort of sentinel class that surveys every word on the internet.
There might also be something about fact-checkers in Fake News Overlords in the EU

I think it is possible to learn something from the fact-checkers; they have resources, suggestions, and strategies, but to get hung up on their evaluations and conclusions and consider them the ultimate authority is not helpful. As mentioned, the way they were used to suppress the debate about COVID was not productive. The suspicions of many people may initially have been poorly founded, but they turned out to be more right than wrong. And with the fact-checkers not owning up to their role in this affair, they have proved themselves to be little more than mercenaries for a move by governments and large multinational corporations to impose control on digital content and implement the policies that they prefer.

This post turned out to be more related to the politics of propaganda research than what this thread was intended to be, but saying nothing about the fact-checking industry would also be wrong, especially when some claim to be qualified to preside as unelected judges of what is permissible and what is not. The industry does not stand alone; what its results are used for is also determined by politicians, some of whom are psychopaths.
 
What if two sides with opposing or different views accuse each other of propaganda? Can there be a meeting point?

Propaganda can be considered a form of communication. Various researchers have tried to describe what is involved in communication. The Wiki for Models of communication has several models. Here are a few examples:

Common_components_of_models_of_communication.svg.png
Many models of communication include the idea that a sender encodes a message and uses a channel to transmit it to a receiver. Noise may distort the message along the way. The receiver then decodes the message and gives some form of feedback.[1]
Propaganda understood as communication has a sender who encodes a message, which is put through a channel or medium of some kind and is later decoded by the receiver.
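As a toy sketch, the sender-channel-noise-receiver loop of the transmission model might be written like this. The noise function is an invented stand-in for whatever distorts a message in a real channel, and all names are mine.

```python
import random

def encode(message):
    # Sender turns an idea into transmittable symbols (here: characters).
    return list(message)

def channel(symbols, noise_rate=0.1, rng=None):
    # Noise randomly corrupts some symbols in transit.
    rng = rng or random.Random(0)
    return ["#" if rng.random() < noise_rate else s for s in symbols]

def decode(symbols):
    # Receiver reassembles what arrived, corrupted or not.
    return "".join(symbols)

sent = "get vaccinated protect your loved ones"
received = decode(channel(encode(sent), noise_rate=0.2))
feedback = "message garbled, please repeat" if "#" in received else "understood"
print(received)
print(feedback)
```

The feedback line closes the loop: the receiver's reaction travels back to the sender, which is exactly the element the simple linear models leave out.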

What the above model does not capture well is what happens inside the sender or the receiver. There are other models for that:
Barker_&_wiseman_-_model_of_intrapersonal_communication_-_text.svg.png

Model of intrapersonal communication by Larry L. Barker and Gordon Wiseman. The left side of the diagram shows the start of the process: external and internal stimuli (red and violet arrows) are perceived. This triggers various cognitive processes (green areas) involved in the interpretation of the stimuli. These processes result in the generation and transmission of new stimuli, which are again perceived.[40]
What I miss in the above model is that both sender and receiver have knowledge and that this influences what is sent, and what is received. David Berlo in his linear one-way model includes the element of knowledge:
SMCR_model_-_full.svg.png
Berlo's model includes a detailed discussion of the four main components of communication and their different aspects.[143][144]
The above models are either one-way or back-and-forth. Openness and the possibility of change are missing from them. I am not convinced such openness is always present, but where it is, this perspective on communication by Frank Dance has a point:
Dance's_helical_model_of_communication.svg.png
Dance's helical model understands communication in analogy to an upward-moving and widening helix.
The description in the Wiki reads:
Dance holds that a helix is a more adequate representation of the process of communication since it implies that there is always a forward movement. It shows how the content and structure of earlier communicative acts influence the content and structure of later communicative acts. In this regard, communication has a lasting effect on the communicators and evolves continuously as a process. The upward widening movement of the helix represents a form of optimism by seeing communication as a means of growth, learning, and improvement.[164][165] The basic idea behind Dance's helical model of communication is also found in education theory in the spiral approach proposed by Jerome Bruner.[166] Dance's model has been criticized based on the claim that it focuses only on some aspects of communication but does not provide a tool for detailed analysis.[164]
What all of the above miss is that people may get a message from an authority, a news medium, a party, a book, etc., and apart from the internal decoding in each individual, there can sometimes also be networking between individuals who share their interpretations. These exchanges may change the understanding and interpretation of the original message and its information content, and in some cases lead to a different response back to the sender than if the receiver had interpreted the message, be it propaganda or not, all on his or her own.


To come back to the question I posed in the beginning: What if two sides with opposing or different views accuse each other of propaganda? Can there be a meeting point? Maybe the two cannot agree, but it is possible to describe the interaction with communication models.

In a previous post, I ended with:
What complicates how a text works for a receiver are also the cognitive biases. More on this another time.
A sender who knows the cognitive biases of his audience can use them to advantage; it is similarly possible to promote biases that can lay the foundation for the communication of future messages. There is also the question of truth. What one source says may not necessarily be intentional propaganda; it may be a sincere sharing of perspective, but opponents may still project malefic intent.

Alternatively, a receiver who recognizes cognitive biases has the opportunity to be more aware when interpreting communications from a sender. The same goes for a receiver who has more knowledge about the subject communicated, as knowledge may help in interpreting a message. Communication models can help to describe what goes on in communication, but it could be interesting to take some of the cognitive biases and think about how they might work when people, organizations, and governments convey information.
 
Alternatively, a receiver who recognizes cognitive biases has the opportunity to be more aware when interpreting communications from a sender. The same goes for a receiver who has more knowledge about the subject communicated, as knowledge may help in interpreting a message. Communication models can help to describe what goes on in communication, but it could be interesting to take some of the cognitive biases and think about how they might work when people, organizations, and governments convey information
Interesting material. Having used these models over the years, I find it useful to highlight just exactly what causes and contributes to so-called 'noise': cognitive biases (and other perception-affecting factors) are definitely part of this noise.

Interestingly, when it comes to Communication Studies, the techniques to reduce noise at the individual psychological level are barely, if ever, addressed.

Model- noise.png

Another interesting model is the Concentric model, which also shows the 'noise' factor, but also highlights the effects of 'filters':

concentric hub model of mass media.png

In 1974, scholars Ray Hiebert, Donald Ungurait and Thomas Bohn presented a new model [as opposed to the Shannon-Weaver information model of 1948]:
- a series of concentric circles with the encoding source at the centre. One of the outer rings was the receiving audience. In between were several elements that are important in the mass communication process, but less so in other communication processes.


GATEKEEPERS: Mass communication is not a solo endeavor. Dozens, sometimes hundreds, of individuals are involved. Anyone who can stop or alter a message en route to the audience is a gatekeeper.


REGULATORS: The concentric circle model also recognizes regulators as a force that shapes and reshapes mass communication messages before they reach the mass audiences. Regulators are non-media institutions that influence media content. The Canadian Radio-television and Telecommunications Commission is a government agency that serves as a regulator. The EU and countries within the EU have such regulators, as do other countries around the world.


FILTERS: Hiebert, Ungurait and Bohn (1974) note that receivers are affected by a variety of filters in decoding a message. They call the language or symbols used for the message informational filters. Physical filters exist when a receiver's mind is dimmed by fatigue or noise. Psychological filters come into play when, for example, a male and a female experience the same film but come away with two different interpretations of the movie.


EFFECTS: A decoded message can do more than prompt verbal feedback. It can affect how someone votes or how they dress, or even provoke a riot. In terms of violence, it can make people actually feel ill or disgusted.


AMPLIFICATION: The mass media have the ability to amplify, which is related to gatekeepers. Amplification is a process by which mass communication confers status on issues and personalities merely by covering them. For example, Barbara Walters, Diane Sawyer, Connie Chung, Peter Jennings, Knowlton Nash, etc.

Status conferral is not limited to news personalities but extends to music, videos, TV personalities, films, etc. The 1967 movie "Guess Who's Coming to Dinner" helped keep racial integration on the American agenda.
For educational purposes, I think it is useful to use and show these tried and true models to prove not only how they have been used to influence, persuade, and manipulate for decades with advertising, but how they are used to persuade and manipulate the masses towards a particular agenda.
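The chain of gatekeepers, regulators, and filters that the concentric model describes can be sketched as composed functions. All the rules below are invented for illustration; the 1974 model names the stages, not their implementations.

```python
def gatekeeper(message):
    # An editor can stop a message outright, or alter it en route.
    if "banned topic" in message:
        return None                               # story killed
    return message.replace("allegedly ", "")      # story altered

def regulator(message):
    # A non-media institution constrains what may be broadcast
    # (here: an arbitrary length limit standing in for real rules).
    return message if len(message) < 200 else message[:200]

def receiver_filter(message, fatigued=False):
    # A physical filter: a tired receiver takes in less of the message.
    return message if not fatigued else message[: len(message) // 2]

def transmit(message, fatigued=False):
    # The message passes through each ring of the concentric model in turn.
    for stage in (gatekeeper, regulator):
        message = stage(message)
        if message is None:
            return None
    return receiver_filter(message, fatigued)

print(transmit("The minister allegedly resigned."))  # → The minister resigned.
```

What the sketch makes visible is that the audience never sees the original message, only the residue that survives every ring, which is the model's central claim.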
 

Huge topic, which I personally feel is of vital developmental importance. Given the type of forum, I suggest considering, perhaps at a later time, the two-way relationship of mutual support, content re-launch, and stabilising the real between the 3D informational field and the 4D informational field, in this case STS, and the role played in this realm by social media.
 
Having used these models over the years, I find it useful to highlight just exactly what causes and contributes to so-called 'noise': cognitive biases (and other perception-affecting factors) are definitely part of this noise.

Interestingly, when it comes to Communication Studies, the techniques to reduce noise at the individual psychological level are barely, if ever, addressed.
Can the avoidance, or absence, be explained as originating in subject definitions and course constraints? It is also possible that some societies would be more interested in having trained people who know how to influence, and can be paid to do so, but who do not recognize or care much about how they too are being influenced.
For educational purposes, I think it is useful to use and show these tried and true models to prove not only how they have been used to influence, persuade, and manipulate for decades with advertising, but how they are used to persuade and manipulate the masses towards a particular agenda.
Maybe in some groups, it would be possible to give examples of cognitive biases and noise factors with short descriptions and invite participants to recall instances from their own experiences. One could then discuss if some factors are more hardwired, and if some are more due to upbringing, health, education, and trends in society. Can "noise" be compensated for, resisted, or accepted?

In some countries, it might not work, if politically incorrect examples are brought out, in which case "a lower key might play better". Every situation is different. Some concepts we learn take time before they find application, corresponding to the last stage of the structure of observed learning outcomes.
Extended abstract – The previous integrated whole may be conceptualised at a higher level of abstraction and generalised to a new topic or area. At this stage, students may apply the classroom concepts in real life.
When it comes to learning and understanding biases or "noise", patience with self and with others can be conducive.
 