Watch TV. Pick up a newspaper. Check your devices. You can’t miss them. A new study. A recent report. A just-completed survey. Even research into the impact of technology on learning. Most of these measurements are meaningful and serve a purpose. They are usually steeped in science, supported by reputable institutes and agencies, and present compelling findings. We as viewers, listeners and knowledge consumers need to do our part, however, and continue to pursue the story.
Here’s an example: In the last few months, we’ve been told by a variety of news outlets and websites that coffee can reverse the effects of liver damage. Coffee can help prevent colon cancer. Coffee can help decrease the risk of endometrial cancer. And coffee can increase the risk of miscarriage.
According to the various results, coffee wields enormous power. No doubt, there are great truths to be learned, but we should look beyond the headlines and dig into the details. What exactly was studied and how? What was the purpose? Perhaps the real value of all these reports is the dialogue and discussion they spark. These findings start us thinking, and thinking solves problems.
Consider, more specifically, research into the impact of technology on learning. A recent report from the Organisation for Economic Co-operation and Development (OECD) analyzed data from its Programme for International Student Assessment (PISA) and found that “despite the pervasiveness of information and communication technologies in our daily lives, these technologies have not yet been as widely adopted in formal education. But where they are used in the classroom, their impact on student performance is mixed, at best. In fact, PISA results show no appreciable improvements in student achievement in reading, mathematics or science in the countries that had invested heavily in technology for education.”
OECD countries and economies include, but are not limited to, France, Italy, Spain, Korea, China (Hong Kong and Shanghai), Japan, Turkey, Brazil and the United States – quite a cross section of nations.
The report was based on data gathered in 2012 that covered a wide variety of students, cultures, economies and technologies, and provided a broad view – think a mile wide but a foot deep – that needs greater examination.
A conversation starter
“This [report] is valuable to start a conversation, but dangerous to use as a conclusive headline,” says Jim Flanagan, ISTE chief learning services officer. “The OECD information is interesting; now we need to build on these findings by getting more granular detail to inform what works and what does not.”
In other words, it’s another tool in figuring out how best to integrate technology to help students achieve.
“What OECD did is very reasonable,” adds Rick Hess, the director of education policy studies at the American Enterprise Institute. “It’s wise to do these comparisons across nations, but this is a snapshot, and we need to be cautious about putting too much weight on a snapshot.”
The aim, then, is to turn a snapshot into a feature-length motion picture through better analytics, assessments and measurements. The OECD did its job and measured bluntly, which was unavoidable; the assessment was never designed to capture the level of detail truly needed. A more surgical touch is required to get at the root issues.
“Look, we need to find better ways to measure how students are doing, period,” Flanagan explains. “We need to get data faster and learn from it. Other industries have developed better ways to identify individual needs, prescribe solutions and measure impact, so shame on us. There are no technical barriers – we just need the will to invest in better assessment and analysis.”
Educators cite the need to improve and harness real-time, growth-based formative assessments that offer small, immediate data instead of the macro look that usually happens only at the end of the year.
A good place to start is to make sure educators are using the right technology in the first place. Rapid-cycle technology evaluations may prove invaluable here. Designed in part to help school leaders make evidence-based decisions about education technology purchases, these evaluations help get the right technology into the right hands quickly so it can benefit students.
The current classroom environment, especially where technology is involved, simply moves too quickly for traditional measurement. Conventional research approaches take too long and cost too much, and by the time a study is complete, new iterations of an app can render the findings outdated before they are even published. In fact, the OECD report draws on data from 2011-2012, a time when smartphones, with their easy access to knowledge, were less common.
“Technology is fabulous. I love technology, but we need to think carefully about how we are using it. We need to use technology to push the boundaries of what we can do in education and not just repeat old practices,” says Helen Crompton, an assistant professor at Old Dominion University, via a video chat. “Furthermore, we are in the 21st century; however, a lot of these tests [like OECD] are often using 20th century formats. There are parts of the OECD where students may be using technology, but they are still requiring students to just regurgitate facts and not actually think critically.”
There are many efforts underway to be more thorough. In the United States, for example, the Office of Educational Technology (OET), part of the Department of Education, is using short-cycle evaluations to assist districts that are moving toward transforming education through the use of technology. The short-cycle evaluations support Future Ready, “a set of online tools designed to help educators evaluate technology,” according to the OET website.
Specific programs are also emerging from the private sector, and some of these are being supported by a variety of organizations and endowments. Karen Johnson is a senior program officer with the Gates Foundation, and her lens for education technology is to look at the foundation’s investments in math and literacy, with a keen eye toward helping administrators and districts choose the best technology tool for them.
She points to Zearn, a comprehensive K-5 math learning experience designed to work with teachers to create a personalized learning experience for every student, as a program that serves two important purposes.
“First of all, Zearn provides exit tickets so the teacher can see how students are doing at the end of each day and understand each student’s progress,” she says. “It’s not only immediate, it’s a full-course curriculum. Many times ed tech tools are supplemental, leaving teachers confused about how to use them and when to deploy them. This makes it clear.”
Getting a sharper view
How we measure technology integration is important to getting a better picture of this issue, but what we measure is equally important to getting an even sharper view.
It’s one thing to talk about the lack of integrating technology effectively, but are we truly educating our children? Are they career-, college- and world-ready? Improved measurements are clearly necessary. The OECD report is an integral part of the process, but what are we missing?
“Speaking from a U.S. perspective, we know that many employers, like engineering firms, don’t necessarily care about language arts standardized test scores,” says Brandon Olszewski, senior education consultant with ISTE. “They care about skills that show workforce readiness, like the ability to solve problems, which I believe is also a global concern. Are students across the globe ready to contribute?”
“Even as we have more students graduating from high school and college, we are not educating enough of them to fill the jobs in today’s labor market. Where are the measurements to answer why?” Flanagan asks.
Olszewski adds that ISTE believes that technology access alone doesn’t determine or shape student achievement. The success lies in how students use technology – and how they’re taught – to shape their experience as learners. That’s why it’s valuable to find out how test scores, like the ones noted in the OECD report, correlate with other outcomes of interest, he says. For example, at a national level, how do scores relate to the number of patents granted per capita?
The start of a dialogue
Meanwhile, the macro nature of the report also spawned a very important dialogue.
“The report is partly about a measurement, but in its own way, it also calls out the need for practicing strong digital citizenship,” Olszewski says. “This will get us thinking more broadly beyond our borders. We can no longer be isolated. It’s time we all speak more than one language, whether it’s English and Scratch or something else. We’re all together.”
Thanks to technology, the world is smaller. Working as one to improve student outcomes is more necessary than ever. It’s not a district, student or school problem. It’s a “we” problem, and teachers in particular play a starring role.
Right tech, right training
So if we’re going to talk about how we measure and what we measure, we need to talk about the on-the-ground experts who should lead the way. Teachers need the right tools and they need ongoing support.
“We wouldn’t give a teacher who has never played an instrument an expensive violin and expect them to teach students how to play,” says Crompton. “Furthermore, we wouldn’t give the teacher one lesson and then expect them to be effective. Technology is the same thing. Yes, let’s buy and invest, but it needs to be the right technology, and the training needs to follow.”
“We have to focus on professional development,” says Olszewski. “We have to make sure our teachers are prepared.”
That preparation is certainly aided by ISTE resources like the ISTE Standards, which support educators, students and leaders with clear guidelines for the skills and knowledge necessary in the digital age; and the ISTE Essential Conditions, the 14 critical elements necessary to effectively leverage technology for learning.
So, coffee: good or bad? The answer is both and all of the above, and it’s up to us to keep searching for more information and find the right answers. For educators, it’s very much the same, with the caveat being that coffee is an individual’s choice and education is a collective concern.
Nevertheless, it’s still wise to avoid report-and-study whiplash and dig down for more details, better testing and more varied categories of what is tested.
“Here we are, more than 30 years into the integration of technology in education, and we still have a long way to go,” Flanagan says. “We have moved at a glacial pace; reports like OECD’s are absolutely necessary. Large institutions, particularly public ones like schools, are not designed to change quickly, but anything that moves us along is helpful. Resources like the ISTE Standards and Essential Conditions are supporting change, but we have a lot of learning to do to accelerate improvements. We need to collect and analyze much better data, much more quickly; for now, it is far too early to reach any conclusions.”
Tim Douglas is a former television news producer who also served as a senior media consultant for several speakers of the California State Assembly. Today, Douglas is a freelance writer who covers a wide range of topics.