Determining how much “screen time” their children should have is a top concern for parents grappling with the unknown territory of raising children in a technology-driven world. The answer, however, is not quite clear.
By definition, psychology is a social science that relies on research to support or refute hypotheses. Psychologists guide their clinical practice by examining the most current research on a given topic. When it comes to screen time, evidence about its impact on child development is only beginning to emerge. Those born after the year 2000, otherwise known as Generation Z, have recently entered adolescence. This postmillennial generation is the first to be born and raised with various types of electronic devices at their fingertips. First, let’s define what constitutes screen time.
According to the Oxford Dictionary, screen time is “time spent using a device such as a computer, phone, television, or gaming console.” Activities that, twenty years ago, required physically running errands (e.g., depositing a check, mailing a gift at the post office, picking up Friday-night takeout) have now been replaced by apps or automated services. Tasks that once took a good portion of the day can now be accomplished in about ten minutes from the comfort of one’s couch (or office chair).
While there is no doubt that technology has enhanced our lives, ambiguity remains about its impact on younger generations. It is this ambiguity that worries parents, whose children’s formative years of cognitive development are coinciding with society’s increasing dependence on technology. Stay tuned (no pun intended): in my next blog post, I will discuss the impact of video games on children’s cognitive development. It might surprise you…