I woke up early again this morning, worried about this upcoming election. I started mucking around on my old 2009 MacBook Pro and found the Federal Election Commission website and their downloadable files with details on campaign contributors by state. Data makes my skin tingle.
So I downloaded all 27 megabytes of North Carolina data (4/15/15–10/31/15), loaded the CSV file into Open Office Calc and started tinkering. My seven-year-old MacBook was huffin’ and puffin’.
One of the questions that got my mind going this morning concerned the money that is so essential to political campaigns today. To date, the 2016 presidential campaigns have generated $1,000,058,201 from individual donations alone. More to the point of my sleeplessness was, “Who’s paying for these campaigns?” or “Who’s buying our government?”
So I used Calc to parse the 133,100 contributions into three range categories: less than $100, $100 to $999, and $1,000 or more. It shouldn’t be a surprise that more North Carolinians donated less than $100 than the other two categories combined.
What struck me as especially critical to my worries was the total amounts of campaign money generated from each category. Look at the data and graph.
| Donations | Number of Contributors | Total Amount Contributed |
| --- | --- | --- |
| Less than $100 | 101,388 | $2,737,190.87 |
| Between $100 & $1000 | 28,427 | $5,454,833.10 |
| More than $1000 | 3,285 | $6,226,996.52 |
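For anyone who wants to reproduce this bucketing outside a spreadsheet, here is a minimal Python sketch. The three boundary rules follow the categories above; the input is a plain list of amounts, since the actual column names in the FEC download vary by file and are not assumed here.

```python
from collections import defaultdict

def bucket(amount):
    """Assign a single contribution to one of the three range categories."""
    if amount < 100:
        return "Less than $100"
    elif amount < 1000:
        return "Between $100 & $1000"
    return "More than $1000"

def summarize(amounts):
    """Count contributors and total dollars per category."""
    counts = defaultdict(int)
    totals = defaultdict(float)
    for amount in amounts:
        category = bucket(amount)
        counts[category] += 1
        totals[category] += amount
    return counts, totals

# Illustrative amounts only, not the real FEC records.
counts, totals = summarize([50.0, 250.0, 2500.0, 75.0])
```

Running the same bucketing over the full CSV’s amount column would reproduce the table’s counts and totals.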
What’s wrong with this picture? Well, let’s say you are an incumbent, or even a challenger. With so much money out there, constituting an elections industry, the only way you can keep your seat, or oust the incumbent, is with a lot of money.
Where do you go for the money?
Look at the diagram again. Where’s the money? To get elected, you have to convince rich people and corporations to contribute. What will they want from you for that money?
It’s their government. Not ours.
We hear it just about everywhere and every time we turn around: STEM. The country (USA) desperately needs more scientists, technologists, engineers and mathematicians. It’s our way of securing our superiority and prosperity, and ramping up S, T, E & M instruction in our schools is the way to succeed.
In preparing for a talk to parents in suburban Edmonton, Alberta this week, I searched for data on Canadian college graduates and the degrees conferred to them. In the process, I ran across a report from the U.S. Institute of Education Sciences.* I copied a data table called Bachelor’s degrees conferred by degree-granting institutions, by field of study, and converted it to an Open Office Spreadsheet (ODS) file to see what I might learn from the data.
The table offered the number of graduates receiving degrees in 32 fields of study, from selected years between 1970 and 2010. I devised and ran formulas that calculated the percent change in the number of degrees by decade. I also created an additional set of rows that calculated the percent of each year’s total graduates receiving specific degrees, to factor out the effects of changes in the total number of graduates. When I sorted the degrees by percent of increase from 2000 to 2010, the ranking was somewhat surprising.
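The two spreadsheet formulas described above boil down to very simple arithmetic, sketched here in Python. The counts are made-up illustrations, not the table’s actual figures.

```python
def percent_change(old, new):
    """Percent change in degrees conferred between two years."""
    return (new - old) / old * 100

def share_of_total(field_count, year_total):
    """A field's share of that year's total graduates, which factors out
    growth or decline in the overall number of graduates."""
    return field_count / year_total * 100

# Hypothetical counts, for illustration only.
change = percent_change(200, 294)   # a field growing from 200 to 294 degrees
share = share_of_total(25, 500)     # 25 degrees out of 500 total that year
```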
At the bottom of the list, showing the least growth, was Computer and Information Sciences. Though the 1970s saw an impressive increase in computer science degrees (469%), the increase dropped to 42% during the ’80s and 33% in the ’90s, followed by a decline (-32%) during the first decade of the 21st century.
Other fields suffering declines were education and English language and literature/letters, both bested slightly by engineering technologies, which fell only 17%. Falling less than that were agriculture, architecture, and liberal arts, sciences, general studies and humanities, topped by engineering, with a 6% decline. Faring just better than engineering was theology and religious vocations, down 5%.
Enjoying substantial increases in degrees from 2000 to 2010, from high to low, were communication technologies; military technologies; legal professions; parks, recreation, leisure and fitness; homeland security, law enforcement and firefighting; library sciences; and visual and performing arts. (see graph)
This was a fairly startling discovery to me, considering the funding, resources, and time invested in STEM education and its cost to other subject areas, not to mention the political capital gained from reciting the mantra to constituents and voters.
The results were such a surprise that I’ve questioned my math several times, checking and rechecking the formulas. I invite you to double-check my spreadsheet [here].
If this is, indeed, an indication of our students’ interests in science, technology, engineering and mathematics during the early 21st century, then is STEM education doing what it’s supposed to do, even if test scores are rising?
Please double and triple check my spreadsheet, and if you find problems with my formulas, please post them in my comments.
* United States. Institute of Education Sciences. Bachelor’s degrees conferred by degree-granting institutions, by field of study. Washington, 2011. Web. <http://nces.ed.gov/programs/digest/d11/tables/dt11_286.asp>.
Who’s Afraid of the Power
(cc) Flickr Photo by Emersunn
I just learned about “learning analytics” from Audrey Watters, a blogger/journalist whom I am reading with increasing regularity. Reporting on the recent Learning Analytics and Knowledge conference in Vancouver, Watters shared a phrase that was used often at the conference: “data exhaust.”
The first time I heard digital data described as exhaust, was by Dave Sifry, the founder of the blog search engine, Technorati. He said something to the effect of, “The blogosphere is the exhaust of the human attention stream.” This was pre-Twitter and pre-Facebook, but it was a notion that intrigued me. I continue to use it in some of my presentations – that we, through our varied and seemingly unceasing networked interactions, are creating an enormous and at least partly useful reservoir of content.
But I wouldn’t call what we might do with that reservoir, “Learning Analytics.”
What appears to be coming from the conversation around this “new discipline,” as it is apparently called, has more to do with learning management than it does with learning empowerment – and that, in the right context, is not wholly unappealing to me. The ability to collect the artifacts of one’s own digital trails, and to visualize and analyze what we’ve learned, how we learned it, and what we’ve learned to do with it, might represent a personal enticement to broaden, enrich, and more purposefully direct our own digital trails.
Yet, like with so many things, we must ask ourselves, “What might happen if this wondrous new tool were to fall into the hands of evil?”
A couple of days ago, I posted on Facebook a reference to ALEC, or the American Legislative Exchange Council. They craft legislation of a specific philosophical leaning and get legislators elected who will pass such legislation under the guise of knee-jerk social issues, patriotic symbolism, and apple pie.
ALEC, which was formed in 1973, operated largely unnoticed until it was revealed that they had penned Florida’s “Stand Your Ground” legislation, resulting in George Zimmerman’s shooting and killing of an innocent teenager, Trayvon Martin, simply because Zimmerman “felt threatened.”
…explosion of legislation advancing privatization of public schools and stripping teachers of job protections and collective bargaining rights…
…is the work of ALEC. There are so many other examples of short-sighted attacks on public education and the intellectual freedom of teachers (see “Who’s Killing Philly Public Schools?”) that I have grown fearful for our future and more than a little resentful that the learner-empowering tools that I have promoted for 30 years seem to be enabling those who would rather use them to “manage learning.”
When I originally sat down to write this blog post, I had in mind a list of reasons why marketplace education is so potentially destructive. However, after putting so many words into this writing already, I have come to believe that the issue is simple. Learning, like breathing, is human. It’s what we do and it is what has made us what we are today. We breathe, we observe, we think, and we learn.
Learning can’t be installed in assembly line fashion, with quality control at the end of each season. It must be nurtured by a compassionate society and by caring individuals.
Privatizing public education would be as inhumane as selling the air – though there are some (ALEC Education Taskforce July 2011) who might like to.
As for Learning Analytics? It fascinates me, because I believe that there are potent skills we might develop and share, for learning important lessons from the digital trails of a billion people.
But the power is not in “learning analytics.”
The power is in ANALYTICAL LEARNING.
So who’s afraid of the power?
Flickr Photo by Jukebox909 ((Jukebox909, . “Polls show distrust of public opinion.”Flickr. N.p., 16 Nov 2006. Web. 12 May 2010. <http://bit.ly/7Xouzr>.))
I have long felt that the greatest value of the social web is in the content that it generates. I suspect that the content’s value compared to the value of “nearly now” ((A term coined by Stephen Heppell)) social idea sharing depends on the person. I’m not a chatter. I procrastinate phone calls. But I love to mine the conversation for ideas, knowledge, and resources that I need right now.
An interesting example of this comes from a Carnegie Mellon University study (pdf) indicating that analyzing data from Twitter posts can yield the same results as conducting a public opinion poll, perhaps costing less and irritating far fewer people.
According to the Mashable blog post I learned this from,
A CMU team from the computer science department looked at sentiments expressed in a billion Twitter messages between 2008 and 2009. The researchers then used simple text analysis methods to filter out updates about the economy and politics and determine if the overall sentiment of the update was positive or negative. The CMU team found that people’s attitudes on consumer confidence and presidential job approval were similar to the results generated by well-reputed, telephone-conducted public opinion polls, such as those conducted by Reuters, Gallup and pollster.com. ((O’Dell, Jolie. “Could Twitter Data Replace Opinion Polls?.” Mashable. 11 May 2010. Web. 12 May 2010. <http://bit.ly/cZa2y8>.))
CMU Assistant Professor Noah Smith thinks that for at least some topics, this kind of passive information gathering could work. Mashable blogger Jolie O’Dell quotes Smith as saying, “With seven million or more messages being tweeted each day, this data stream potentially allows us to take the temperature of the population very quickly.”
Twitter data tends to be noisy, as any tweeter out there knows. But so too is even the most carefully polled data. Researchers learn to filter out the noise, the extraneous data, and round out the results to reveal trends and indicators.
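A toy version of that kind of filtering and scoring is easy to sketch. The word lists below are illustrative stand-ins, not the lexicon or methods the CMU researchers actually used; the point is only to show how a crude positive/negative signal can be aggregated across many noisy messages.

```python
# Illustrative sentiment lexicon -- an assumption, not the study's word lists.
POSITIVE = {"good", "great", "confident", "approve", "improving"}
NEGATIVE = {"bad", "worse", "fear", "disapprove", "failing"}

def sentiment(tweet):
    """Return +1 (positive), -1 (negative), or 0 (neutral) for one message."""
    words = tweet.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return (score > 0) - (score < 0)

def positive_ratio(tweets):
    """Fraction of opinionated tweets that are positive -- a crude 'poll'."""
    scores = [sentiment(t) for t in tweets]
    pos, neg = scores.count(1), scores.count(-1)
    return pos / (pos + neg) if (pos + neg) else None
```

Averaged over millions of messages a day, even a scorer this blunt can start to track a trend, which is roughly the study’s insight.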
Twitter, as a source for opinion trends, certainly isn’t going to work for just any topic, and the data collected via Twitter tends to fluctuate more on a daily basis than formally polled data, as the study discovered. I often make the point that we will continue to need authoritative, scientific, and formally vetted information to solve many of our problems. But in a time of rapid change, we also need to develop the skills to cull out timely, experiential, and community-shaped information to answer some of our brand new questions and solve some of our brand new problems.
Added Later: From this, one might say, with an increasingly conversational and participatory web, who needs public opinion polls? Certainly the issues involved are far more complex than that. But I can’t help but wonder if teaching and learning might come to take place in a more networked, digital, and info-abundant environment, and we might continue to develop data mining capabilities, if we might reach the point where the obvious question would be, “Who needs tests?”
Flickr Image (LHC Tunnel) by Mario Alemi
I’m on another of those wonderful stretches at home catching up with family, trying to catch some movies and mostly spending every spare minute trying to get as much office work done as I can (some writing and work on Citation Machine) with no time to read and blog.
But this post by David Wiley at Iterating Toward Openness was one of those sneaky reads that tricked me into wondering whether I’m actually wrong about something. In The LHC and Education, Wiley started with his interest in the Large Hadron Collider. To say that the LHC was incredibly expensive is a sad understatement, and the machine does little more than generate data. But with that data, scientists will map realms of the universe that most of us can’t even imagine, much less see.
Wiley loves data. I love data. But he switched contexts, lamenting that…
The data that we, educators, gather and utilize is all but garbage. What passes for data for practicing educators? An aggregate score in a column in a gradebook. A massive, coarse-grained rolling up of dozens or hundreds of items into a single, collapsed, almost meaningless score. “Test 2: 87.”
It’s one of the reasons that “data driven decision making” doesn’t make my heart flutter the way that it does for others. It’s that, even in the best of situations, the data is scarce, shallow, grainy, and awfully expensive to collect — not to mention that the only people who can make much use of it are the data dudes that school systems have been hiring over the past few years.
Then he totally chafed my soul by suggesting (and rightly so from some points of view) that “…using technology to deliver content is not improving the effectiveness of education…” but that another way of using tech might. Wiley continues,
I believe there is (another way). I believe it so strongly that for the first time in several years I am opening a new line of research. I believe (and I fully admit that it is only a belief at this point) that using technology to capture, manage, and visualize educational data in support of teacher decision making has the potential to vastly improve the effectiveness of education.
I have written recently (Where Obama is Getting Education “Wrong”) that I think we should be teaching students to capture, manage, and visualize data as a basic working skill. It seems to me that ushering data away to the central office, to be worried over as an educational concern, may actually be detrimental to the learning our students need to be engaged in. Limited resources will cause us to put undue emphasis on what can be easily measured, at the expense of those important skills and knowledge that can’t be.
But Wiley compellingly inspired in me a willingness to reconsider, and I found my problem. It isn’t that I object to using data to inform better instructional decisions. It’s that the data is so lousy – “…scarce, shallow, grainy, and awfully expensive to collect.”
What if all of our students were doing all of their content work and content processing digitally? What if all of the information transactions of learning, besides the most appropriately open conversations, were done with abundant, networked, and digital content? That would be an enormously dense, rich, and seductively meaningful mass of data that could be analyzed and visualized in a wide variety of ways. I’d be happy with that – especially if students became partners with us as self-analyzers and self-assessors, mastering their own skills as information artisans.
The reason that I cannot get eight hours of sleep is that I am haunted. I become possessed by conversations I’ve had in my waking hours. I am drawn from my sleep by cold bony fingers reaching out from the graves of past presentations – by the insightful, but initially unrealized, comments made by participants and those questions that I wish I’d answered better.
Yesterday, at Saint Mary’s, a private girls’ school here in Raleigh, I was asked, “Although I agree with your call to better prepare our children for the future with more authentic assignments, does that help us in our mission to prepare girls for college?” Then the librarian asked (and these questions are grossly paraphrased), “I know that Wikipedia and Google are invaluable tools — but what is the place for the online databases that we subscribe to?”
The ghost of workshops past that I must exorcise right now, was something that the Dean of Faculty said to me after the presentation. He related a conversation he’d just had with a math teacher of many decades who told him that she started to “get it,” when the presenter suggested that we need to be asking ourselves,
What kind of questions will we ask on our tests, when our students walk into the classroom with Google in their pockets?
And then he (I) asked the audience to consider calculators — how, for years, we resisted the new devices because it wasn’t math. It didn’t look like the math instruction we traditionally provided, and so we almost demonized the things. But now that calculators have become a critical part of many mathematics classes, have they changed the questions we ask? Have they changed the problems we ask our students to solve? Has it changed the nature of math instruction?
The answer, of course, is, “Yes!” Calculators empower learners to work numbers to an end. They force students to transcend paper and pencil, to truly utilize the language of numbers to solve problems, answer questions, accomplish goals — to learn new things. I maintain that we should expect learning in the classroom to be the same as learning in the “real world” — that it is about ubiquitous access to the global flow of information and the tools that empower us to work that information.
It’s where the Obama Administration has it completely wrong. According to Secretary Arne Duncan’s July 24 Washington Post op-ed, “The president starts from the understanding that maintaining the status quo in our schools is unacceptable.” (( Duncan, Arne. “Education Reform’s Moon Shot,” The Washington Post 24 Jul 2009. Web. 18 Aug 2009. <http://www.washingtonpost.com/wp-dyn/content/article/2009/07/23/AR2009072302634.html>. )) Yet, it appears to me that the status quo is exactly where we are staying. Like the former failed administration, the answer seems to be do the same thing, just do it more, do it harder, do it longer, and our children will gain the skills they will need to “compete in the global economy.” This is wrong on so many levels that I just want to throw up my hands and give up.
So back to my haunt. What interests me about the connection made by the math teacher between the calculator and the Google’d cell phone is that they are both about empowering learning. Of the four (entirely unoriginal) education reform areas (see left) being targeted by the administration’s dangled carrot ($4.3 billion), the one that irks me the most is number three: data.
Now I love data. I love what you can do with data. Data visualization is one of my favorite themes to follow on Twitter. But what’s wrong with “Building data systems that measure student success and inform teachers and principals..”, and with so much of the prevailing conversation about education reform, is that it’s about empowering teaching and schooling. It’s designed to help us do our jobs better as educators — when we need to be figuring out how to empower our students to do their jobs better as learners.
Obama, through Duncan, wants us to use data to measure student learning — and as a result, to further limit what we teach to that which can be measured. What we should be doing is helping our students to use data, so that they can measure their world and better understand their relationship with that world — including what can’t be measured.
The bottom line, in my opinion, is that we are continuing down the same dumb path of thinking that we need high school and college graduates who know the answers to old questions. This is wrong!
It’s new questions that will define our future. Today, we need graduates who can invent answers to the “new questions.”