This post comes after much thought, research, and deliberation. It also comes after a barrage of articles in periodicals and newsletters as diverse as TIME magazine and the AERO (Alternative Education Resource Organization) online newsletter. Everyone (or so it seems) has a very strong opinion on the subject of technology and its proliferation in our everyday lives.
On one hand, I completely understand and agree with the argument that an overabundance of screen time is keeping our kids from learning about and spending time in nature, and that they are losing their innate ability to entertain themselves. Being connected and “on” all the time also has neurological consequences. Our brains are not designed for constant external stimulation.
However, to blame technology for all of society’s woes is completely unfair. Parents and grandparents have been complaining about the next generation and its habits since the beginning of time, or at least the last century. If you listen to the subtle messages from some, our kids (the Millennials) are disrespectful, disengaged, spoiled, unsociable brats. But go back 50 years, and the Greatest Generation was using the same words to describe the Baby Boomers. And 30 years ago, those Boomers were saying the same about the Gen X'ers. It is amusing to realize that computers, cell phones, and all the other gadgets didn't exist on a personal-use level until just a few short years ago. And in the words of the awesome Neil deGrasse Tyson, “Kids are never the problem.”
I firmly believe that, as with other parts of our lives, moderation is the key. Consume too much of anything, whether it is “good” or “bad,” and you are bound to feel the ill effects.
We use technology as a tool to learn or to grow a business, to write or to play games, to research or to communicate. It is part of our everyday world and makes our lives easier. We can't refuse to use technology, or legislate it away, just because some may use it to excess.
The ease with which people can access information, communicate, create art or literature, or entertain themselves through games or video is the main reason self-directed learning is surging to the forefront of educational practices. Technology serves to enhance the learning experience, whether in the classroom, an internship, or a hands-on experience; it doesn't negate the importance of skilled instructors, mentors and facilitators.
In high school, I suffered through a typing class that I actually failed, or maybe I just quit (I honestly don't remember). It was almost as miserable an experience as the French classes I endured in 8th grade and then again as an adult in college, or the remedial grammar “class” I tolerated during my first year of college (where I did worksheets while listening to lessons through headphones in the library basement).
What is the point of boring you with the educational failures of my life? Well, I did learn to type, sort of, on a computer. My word count is still abysmal, as is my spelling, but with a PC, I can delete, backspace, control “X,” “C,” or “V,” and go to spellcheck, the dictionary, or thesaurus.com to my heart’s content. I am the only one who really cares how fast I am typing and whether my fingers are on the “home keys” or not. I have managed to write two chapter books for kids (one is self-published; the other languishes in my documents folder), research papers, and multiple blog posts. As with everything else, I am constantly practicing and improving this skill. Plus, I am ever so grateful to an obliging friend who is willing to spend her time reading and editing everything I create before it goes out to all of you.
I won't mention the language thing except to say that some of us have a proclivity, and some of us don't. I am sure that if it were important enough to me, I would gladly and joyfully tackle another language. Just to be clear, I have never had to, or wanted to.
When I went back to school (SUNY Potsdam) in 2001 (only 13 years ago), I had to use microfiche and periodical indexes to source any material I needed to write a research paper. When I think about the amount of time I spent finding articles to compare the prevalence of osteoarthritis in ancient peoples to modern populations for my final paper in Osteology, I want to sit down and cry. On the upside, MacKenzie, at the age of 4, had a great time feeding nickels into the microfiche reader. Yes, even as a preschooler she entertained herself fairly easily. No need to wonder why she has become a successful independent learner.
Today my children or students can access everything they want to know by typing a subject or question into Google, or into one of the databases colleges subscribe to specifically for academic research. The answer to anything they could possibly wonder about is right there in front of them in moments.
Not only do they receive the answers to their questions instantly, they also learn to question those answers. And that is what gets to the very heart of self-directed learning: the ability to distinguish a correct answer from an incorrect one, the understanding that there are bad answers, and the knowledge that it is okay to question what appears to be an authoritative source. This is learning how to learn: seeking, questioning, being willing to explore a subject with someone else, and, in the end, trusting your own judgment and intuition.
We humans have always had that capacity, but technology has made it possible for everyone to pursue anything they want to explore at a moment’s notice. Young or young at heart, we aren't done learning until the day we leave this earth. Okay, playing games, watching movies or silly videos (cat videos and the like), browsing Facebook, and listening to music are fun and important, too.
Don't miss a post!
Sign up here to get the DRC Blog delivered to your inbox.