December 12, 2014

The Axiom of Awesomeness

Everything is a skill.
Is intelligence something we are born with? Or can we become more intelligent with hard work?  The answer is certainly a bit of both, but a critical variable is our personal opinion on the matter: if you don't think you can get smarter, you won't.

In classroom settings, if students think intelligence is malleable, they are more motivated, exert greater effort, and outperform students who think intelligence is a fixed trait (I'm not compensated for that link, BTW).

Unfortunately, I think the broad implications of these observations are not always appreciated.  Often, the message is that a growth mindset - an attitude that self-improvement is possible - is critical for kids' performance in school.  However, a growth mindset is really about performance by anyone doing anything.  The real take-home message from studies like these is that a general growth mindset is a critical first step before we can become better at life.

To counter this poor messaging, I propose the following Axiom of Awesomeness.  The Axiom outlines a philosophy of growth that we must accept before we can improve.  I'll enumerate the axiom first and then expand on each tenet.

The Axiom of Awesomeness

1. Everything we do is a skill.  
2. Every skill can be improved by deliberate practice.
3. Skill improvement through deliberate practice takes time and effort.
4. Every moment is an opportunity to practice a life skill.

OK, so that's the Axiom. Hopefully I will be able to convince you that the axiom is on-target and that it's an important cognitive framework to support becoming a rockstar.

1. Everything we do is a skill.  

While most research on the growth mindset has focused on school performance, school performance is only one part of life.  As I've discussed before, being good at life goes way beyond an SAT score.  However, everything else we do can be considered as much a skill as test taking.  For example, staying focused on a task is a skill.  Controlling our emotions during stress is a skill.  Managing relationships is a skill.  Speaking is a skill.  Writing is a skill (one I'm struggling with right now).  Skills aren't limited to those things we traditionally consider skills, like playing the guitar, drawing, or taking tests.

2. Every skill can be improved by deliberate practice.

No one is born playing the guitar or knowing math or being a great public speaker or being a leader.  Even the masters in fields like these spent countless hours focused on getting better: they practiced (for an in-depth analysis of this, check out Mastery by Robert Greene).  Dedicated, focused practice is essential for improvement and success.  For more on the focused part of this tenet, see this article by Cal Newport.

However, let's keep in mind the first tenet of the Axiom: everything we do is a skill.  For example, let's say I'm bad at following through on projects (something most people struggle with, I'm sure).  Well, following-through-on-projects is a skill.  That's the first tenet of the Axiom.  The second tenet is that deliberate practice improves skills.  Thus, I need to come up with some way to practice this skill.  In my example of "following through on projects", I might select one small project as a "fail at no cost" test case, which will allow me to work on my issue without getting discouraged.  Once I succeed at the test case, I can set my sights on something more challenging.

3. Skill improvement through deliberate practice takes time and effort.

Ahh, but here is the tricky part: getting better takes real work!  Sorry, I know we all want someone to tell us the secret to being amazing (like this book claims).  But that's not reality and we all know it.  Improvement takes focused effort over long periods of time.  Until we've put in years of practice, we have no business declaring something an impossible task.  My homunculus likes to say to me: "Stop whining like a little baby, put your big boy pants on, and get to work".

4. Every moment is an opportunity to practice a life skill.

The good news is that getting smarter, more intelligent, or better at life can happen at any moment.  We don't need to concoct some arbitrary self-improvement program that goes on our calendar.  Every moment of living is a practice opportunity.  Why?  Because life isn't easy and we aren't perfect, which guarantees that we will always screw something up.  With the Axiom of Awesomeness in mind, these challenges morph from "this sucks and I suck" into chances to get better at life.

Here is a perfect example: preparing for a presentation.  Making and giving a presentation is an opportunity to resist the urge to procrastinate (a skill), focus on a single task (a skill), manage our anxiety (a skill), communicate to a group (a skill), and bounce back if it doesn't go well (a skill).  Every step of preparing for this presentation is a practice opportunity.  My favorites in the list are managing our anxiety and bouncing back if it doesn't go well.  These are the skills that most often trip us up because they are so hard to define and aren't viewed as skills at all.  Frequently, they are chalked up to "that's just the way I am."  Wrong.  We must remember the first tenet: everything is a skill.  Next time can be better, but we have to keep practicing.

In Summary

Perhaps this is obvious, but a belief that improvement is possible determines whether we improve.  However, a growth mindset is only part of the equation.  As I argue with the Axiom above, a growth mindset must be supplemented by three additional elements: 1) a definition of skill broad enough to include anything humans do, 2) a willingness to put focused effort into improvement, and 3) an attitude that life's challenges are opportunities to get better at living.  In combination, these four assumptions will not only permit improvement but will also be motivating during periods of difficulty when our resolve is tested.

Stay happy!

November 29, 2014

Intelligence in the information age

I know kung fu.
"I know kung fu." And with that, the download is complete: a martial art has been converted to tacit knowledge with little more than a USB cord connected to the cerebellum. Although this is a fanciful concept portrayed in the movie The Matrix, and one that, as depicted, is far from reality, the information age is certainly changing the landscape of knowledge and expertise.  While an instant download to the brain isn't possible, anyone with a smartphone or a cheap laptop has instant access to all of humanity's knowledge.

What are the implications of this access for our concepts of intelligence and education? For one, the goal of education can no longer be viewed as acquiring information.  Information is free and easily obtained.  Instead, from my humble perspective, one critical skill is the speed at which we can process this information.  A second is the ability to distill larger patterns from the information available.

The Matrix offers additional useful analogies here.  The protagonist, Neo, quickly learns that his enemies have access to all the same knowledge as he does: everyone knows kung fu.  It's only when Neo transcends this knowledge and begins to manipulate the matrix itself that he is able to conquer his enemies.  In other words, he goes meta.  Instead of mastering specific knowledge, Neo identifies the nature of knowledge itself and can manipulate it as he needs.

The same may be said of us in the information age.  Everyone knows kung fu because we all have access to all the same information.  In this situation the ability to process that information at a higher level becomes essential.  Meta-knowledge skills will separate the effective from the ineffective.  Examples of these skills include finding, filtering, creating, and communicating knowledge.  When everyone has access to the same facts, effectiveness will be measured at this higher level of abstraction.

The Matrix has another sobering lesson to offer us in the information age.  Even after Neo masters the matrix, transcending it, the matrix isn't the real world.  The matrix is a construct that clouds the mind of humanity, preventing access to reality.  The same could be said of the internet and information technology.  At the end of the day, knowledge workers and information technology must create real-world value.  People must eat and sleep, be sheltered and clothed.  Information can do none of those things, but it can enable them if applied well.  Like Neo, none of us can be considered effective (and by extension, intelligent) if we are unable to apply the information available to us to problems in the real world.  In this way, applying information to reality is the ultimate transcendent skill and, simultaneously, the hardest to master.

February 12, 2014

Resilience commercialized

While watching the Winter Olympics, I was pleasantly surprised by this commercial from Procter & Gamble.


Although I don't know how successful the commercial will be for P&G (I couldn't name a P&G product, and the commercial doesn't describe any), I think it is an incredible public service announcement.  By promoting resilience, grit, and determination, P&G may be really helping people, as is summarized in this excellent New York Times piece on resilience by the appropriately named Paul Tough.

February 06, 2014

My Child's Brain

My brain is growing.
I recently reread the excellent book Welcome to Your Child's Brain by Sam Wang and Sandra Aamodt.  This book is an impressive compendium of information and evidence-based tips about child brain development.  As I watch my son grow up (now 16 months old), it's fascinating to read about the underlying processes that make it all possible.  It's also nice to hear what science has to say about child development, which mostly had the effect of reducing my anxiety.

Also, I'm reminded of how rare it is to find a resource for parents that is based on science and evidence rather than the opinion of some random "expert".  My wife and I experienced this firsthand when friends suggested that we read On Becoming Baby Wise for information about how to get our son to sleep through the night.  We started to implement its program, then learned that the lead author didn't hold a college degree and had never conducted a study of the program's effects.  Of course, it's impractical to wait for a validating study before trying anything, but if evidence does exist, then that should be the place to start.

My one critique of Welcome to Your Child's Brain is that it is a bit technical.  I was fine with that, but only because I spent six years of my life reading neuro-jargon; for the uninitiated, the book is weighed down by terminology.

In the end, parents need more resources like Welcome to Your Child's Brain that are based on facts, not speculation.  As far as I know, no organization exists that serves as a watchdog for bad parenting advice... Perhaps we need one?  What do you think?

December 16, 2013

Big, beautiful, bouncing... Brains.

Bounce monkey brain, bounce!
As I mentioned last time, I've become intrigued by the relationship between brain function and diet: does the brain run best on a particular set of fuels?  Does peak mental performance require some specific type of foods?  Or, can we stuff our faces with garbage and expect to have tip-top noggins?

I've started my research by examining the evolutionary context for our big brains.  Specifically, what type of dietary environment supported the evolution of our minds?  Can we learn anything from examining the paleontological record of our ancestors?

I've learned a few interesting things so far, all related to the coincident timing of certain events during the rapid encephalization, or brain growth, that culminated in the human brain.  The first thing that jumps out is the earliest evidence of stone tools, which roughly coincides with the appearance of the species Homo habilis around 2.3 million years ago (mya).  At the point that tool use became evident, the brains of H. habilis were roughly "the same size as that of a chimpanzee."

It was over the next ~1 million years that brain growth was most dramatic: the ~600 cc Homo brain case nearly doubled in size to ~1,100 cc in H. erectus.  Although the exact timing is still controversial, H. erectus was the first species to demonstrate the controlled use of fire, with evidence in the form of obvious ovens by around 200 thousand years ago (kya), although some argue that cooking existed as a technology for much longer.  If we assume cooking really took off at the more recent end of the possible range, this closely aligns with the emergence of anatomically modern humans - the oldest fossilized remains that closely resemble modern humans - and represents the other major coincident event that seems to correspond with a major advance in the human lineage.

These examples of critical cultural advances - tool use and control of fire - both indicate that our ancestors relied heavily on hunted or scavenged animals as a source of nutrient-dense food: animal bones marred by both tool marks and burn marks have been observed.  These facts, combined with analyses of both current and inferred hunter-gatherer diets, suggest that the eating of animals was an important factor in the development of the human brain (sorry, vegetarians).

That being said, the evidence also seems to suggest an important role for significant plant-based food in our ancestors' diets: fruit, tubers, nuts, and vegetation.  Supporting an energy-hungry organ like the human brain takes a lot of raw material and energy, so it's reasonable to assume that our ancestors ate anything they could hunt... or gather.  However, the caveat to applying this "eat what you can" or omnivore approach to our modern diets is the relatively recent emergence of agriculture on the scene ~10 kya.  The explosive growth in the availability of carbohydrates enabled by agriculture is hypothesized by Gary Taubes and others to be the cause of many of the modern maladies of civilization, including obesity, diabetes, heart disease, and cancer.  For the most part, I buy the argument that too many carbohydrates are likely driving the rapid rise in obesity and related comorbidities, but that is for another day.

Regarding brain function, the open question for me is this: if our brains are hungry organs and we evolved to eat anything we can, shouldn't more of everything (including carbohydrates) be better for peak brain function?  Isn't it possible that the best diet for brain function may not be the best diet for long-term health?  I don't feel confident about the range of possible answers to this question, but my rough analysis of our brain evolution would seem to suggest that an emphasis on animal protein and the "gatherable" fruits, nuts, and vegetables may represent a sweet spot for brain function as well as overall health.  In future posts I hope to explore the mechanisms of brain metabolism in the context of different levels of nutrients, with the hope of learning more about how the brain prefers to get its fuel.  Perhaps the hunter-gatherer diet is best for brains, but I will wait for more evidence before I make the call.