Ed-Tech: Pedagogical Innovation or Character Obliteration?

Remember when smartphones were banned from schools because they distracted students from their lessons and disrupted the natural flow of learning? Those days are over.
Today, the prevailing message is that pedagogy must be infused with technology to make learning more interactive, engaging, and personalized. Educational technology companies are selling their products like hotcakes, claiming the tools make knowledge acquisition more efficient, marketing devices to schools while throwing in apps for free. However, not everyone has joined the bandwagon.
The University of Colorado recently issued a report enumerating the many problems with this personalized-learning approach, not to put the kibosh on the technology, but to help schools understand its consequences.
The thesis? These ed-tech devices are being fed to schools to give the impression that high-end, state-of-the-art tools are useful; in fact, they exploit students’ data privacy and undermine what education is supposed to be. The devices “effectively funnel children into a ‘surveillance economy’” that harvests their data for profit: the tools are poorly designed to keep their users’ information safe, while the sales pitch focuses only on how liberating “personalized” learning is.
The marketing strategy of ed-tech is a juggernaut, now approaching a monopoly. Ed-tech intensifies corporate marketing by building profiles on students, harboring a colossal amount of data to be bargained over, transferred, and sold. In fact, even the few states that limit the sale of this confidential information permit transfers during acquisitions or on the verge of bankruptcy.
This software, using artificial intelligence, claims to gauge a child’s mental state and how the individual responds to various situations by detecting the child’s heart rate and facial expressions. Developers feed the acquired data into algorithms, which the Colorado researchers describe as theories reflecting which pieces of information the algorithms’ authors consider valuable and how those authors believe the pieces should form a conclusion. Such algorithms purport to foresee a student’s disposition and behavior, such as plans to cut class, drop out of school, or flunk a course; this prediction is what “personalizes” the learning.
The report exposes a pitfall of the algorithmic world: algorithms are built from their creators’ own views, including their inhibitions, preferences, biases, and beliefs, in an attempt to mimic human intuition. Because perceptions differ from one individual to another, no single creator’s standard applies to all situations.
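
To make this concrete, here is a toy sketch (my own illustration, not anything taken from the report) of how a designer’s beliefs become “the algorithm.” Every feature name, weight, and threshold below is a hypothetical assumption chosen by its author, not an objective fact about students:

```python
# Toy illustration only: a hypothetical "dropout risk" scorer whose every
# weight and cutoff encodes its author's beliefs about what matters.

def dropout_risk(absences: int, avg_quiz_score: float, bored_flags: int) -> bool:
    """Flag a student as 'at risk' of dropping out."""
    score = (
        0.5 * absences                   # the author assumes absences matter most,
        + 0.3 * (100 - avg_quiz_score)   # ...that low quiz scores signal risk,
        + 2.0 * bored_flags              # ...and trusts emotion-detection output
    )
    return score > 15.0                  # the cutoff itself is a judgment call

# Two designers with different beliefs would flag different children.
print(dropout_risk(absences=4, avg_quiz_score=70.0, bored_flags=3))  # True
```

Swap in another author’s weights and the same child is no longer “at risk,” which is precisely the report’s point about bias.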
Another hitch is the absence of transparency. Because the processes are proprietary, no third party can inspect the algorithm or verify its accuracy.
Apparently, the creators have also not figured out how to encode the perceptions and reactions that an individual teacher would respond to as stimuli. When the software detects no result, there is no feedback with which to gauge the pace of learning.
The third issue is that “most companies do not field test products before shipping them to schools, nor do they conduct significant research to validate their claims.” Educational institutions do not thoroughly investigate the learning tools they employ and often fall for sugar-coated marketing. These decisions undermine students’ ability to learn, effectively treating them as “specimens” for testing whether the product works.
“Even if you trust everyone spying on you right now, the data they’re collecting will eventually be bought or stolen by people who scare you. We cannot secure large data collections over time,” warned Maciej Ceglowski, a technology entrepreneur, about mediocre data security. Indeed, the report presents staggering figures about the lax protections offered by tech companies: only 46 of the 152 privacy policies studied reported using encryption to protect the data.
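
For perspective, encrypting records at rest, the baseline most of those 152 policies did not even mention, takes very little code. The sketch below is a minimal illustration (not any vendor’s actual practice) using the widely available Python cryptography package; the record fields are hypothetical:

```python
# Minimal sketch of encrypting a student record at rest, assuming the
# third-party "cryptography" package (pip install cryptography).
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, kept in a key-management service
cipher = Fernet(key)

record = {"student_id": "hypothetical-123", "heart_rate": 88, "mood": "frustrated"}
token = cipher.encrypt(json.dumps(record).encode())  # ciphertext, safe to store

assert json.loads(cipher.decrypt(token)) == record   # only a key holder can read it
```

That so few vendors even claim this baseline is what gives Ceglowski’s warning its bite.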
The report reflects the same ethical questions a parent might raise. Should schools take student security for granted just because a tool is billed as an “in-depth,” “revolutionary” form of learning? Is the surveillance economy the new way to produce industry-ready individuals under the siege of constant monitoring? Should education and tutelage be reinvented to the point where on-the-job training is the only “physical” part? And should these biased algorithms become the standard by which universities admit students?
The report concludes by recommending that algorithm-powered education software be required “to be openly available for examination by educators and researchers,” subjecting it to scrutiny by independent third parties who can study both its accuracy and its bias.
However, the report’s recommendations should be more specific. They should require a full explanation to parents of what data is collected, how the algorithms work, and with whom the data is shared; an honest account of how securely the data is stored and how reliable it is; and, above all, the parents’ permission before their child participates.
Footnote:
Robbins, Jane. “Is Technology Really Better for Education?” The American Spectator, American Spectator Foundation, 27 Sept. 2017, spectator.org/is-technology-really-better-for-education/.
