Cloaking Device Alters Appearance Instead of Hiding an Object Altogether

Metamaterials are meta-awesome. They look deceptively simple, and yet they’re probably the building blocks of future Star Trek technologies come to life.

To the untrained eye, they just look like a sheet of grids or squares of geometrically repeating copper wires. Beautiful, to be sure…but technologically brilliant?

Nah.

Except that they are. These rows and columns of repeating copper configurations are tediously mapped and built to interact with electromagnetic waves in a very particular way.

Let’s say you wanted to build a magnifying sphere that gave you a close-up view of anything placed in its hollow center. First you start with six magnifying glasses configured like the walls of a cube. That might work, except if you were looking at any of the walls from an angle. So you make the magnifying glasses smaller and smaller and put more and more of them together until you get something that looks like a disco ball.

Well, now the magnifying glasses are probably too small to use. Practically speaking, this would never work. But if you can theoretically grasp this concept, then you can start to grasp metamaterials.

The major difference is that glass can focus light, while metamaterials can (theoretically) manipulate any electromagnetic wave on the spectrum in a variety of unnatural ways.

A rendering of the first cloaking device

Take, for example, the first cloaking device ever built. It has several layers of concentric metamaterial rings that are scaled and aligned to interact with microwaves. Instead of having them bounce off in random directions, the material carefully directs them around the circle, joining them back together on the other side. So to an observer sitting opposite the microwave source, it appears as though nothing is in the waves’ way.

While very cool, this first attempt at cloaking had a ton of limitations. It only worked on the plane of the circles—look down from the top or from a 45-degree angle and there’s no effect. And since no microwaves can reach the interior of the shield when it’s working properly, whatever is inside can’t see or talk to the outside world using microwaves either.

But this was almost a decade ago. In the years since, research teams across the nation have made some pretty neat advances, and the most recent comes from the lab of Douglas Werner at Penn State.

Instead of creating a rigid 3D structure that completely shields its interior, Werner and postdoctoral fellow Zhi Hao Jiang have created a flexible printed sheet that can wrap around any 2D object and create the illusion that it is made of a different material. For example, a typical radio antenna made of conducting metal could be made to appear to be a rod of dielectric material like silicon or Teflon.

Why does that matter?

Well, if you’re trying to hide an object from an enemy spy, that probably won’t help you too much. But if you’re trying to hide an antenna from surrounding radio wave interference while still allowing it to function properly, then you’ve hit the jackpot.

Don’t expect to see it on the shelves or in the real world anytime soon, though. The new coating only works for a 20-degree field of view, which might not be all that helpful when there’s electromagnetic interference all around us every day. But it’s a nice step in the right direction, and the field is still less than a decade old.

The study, “Quasi-Three-Dimensional Angle-Tolerant Electromagnetic Illusion Using Ultrathin Metasurface Coatings,” was published in Advanced Functional Materials by Jiang and Werner of Penn State University.


Playing Video Games a Lot Linked to Healthier Young Adults…or Not

I came across this headline recently from the University of Illinois, “Teen gaming addicts may wind up physically healthier as young adults, study says.”

That’s a big head-scratcher. I mean, really? You’d think that the more a teen is plopped down in front of a gaming console, the less energy they’d expend and the less likely they’d be to get physical exercise in the future. Nonetheless, there it was.

The claims come down to three main results. Five years after the start of the study, heavy gamers were less likely to smoke marijuana, less likely to have a high BMI, and more likely to suffer from depression.

Umm… those don’t really go together.

Looking closer at the study, there are a lot of problems to be found with the statistics. First off, while the release touts that the study examined more than 10,800 youth in the United States, the actual number of data points for teens playing a lot of games was rather low. The researchers split heavy gamers into three groups: 21+ hours per week (136 teens), 35+ hours per week (49 teens), and 42+ hours per week (27 teens).

Only 136 data points isn’t a lot to draw conclusions from, let alone 27.

Next, we can dig down a bit more into the actual statistical analysis. First off, a p value is roughly the probability of seeing results at least as extreme as the ones observed if there were no actual effect and only random fluctuations at play. The closer the value is to 0, the less likely the result is a fluke. Typically, a p value less than .05 is considered decently suggestive of a real effect.
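To make that concrete, here’s a toy example of mine (made-up numbers and a plain two-sample t-test, not anything from the thesis) showing where a p value comes from:

```python
# Toy illustration of where a p value comes from. The BMI-like numbers are
# invented for demonstration; this is not the study's data or analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
heavy_gamers = rng.normal(loc=24.0, scale=3.0, size=27)    # hypothetical BMI values
everyone_else = rng.normal(loc=25.5, scale=3.0, size=500)

t_stat, p_value = stats.ttest_ind(heavy_gamers, everyone_else, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p value under the usual 0.05 cutoff hints the gap isn't just sampling
# noise -- but with only 27 people in one group, the estimate is shaky,
# which is exactly the complaint above.
```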

So in this study, we have the following p values:

                    21+ hrs/wk     35+ hrs/wk     42+ hrs/wk
Marijuana           p = 0.019      p < 0.001      p < 0.001
Lower BMI           p < 0.000      p < 0.001      p < 0.001
Depression          p = 0.342      p = 0.703      p < 0.001

So while it looks like the results regarding marijuana and lower BMI might be a real effect, for some reason the depression result only stops looking like a statistical fluke in the group with the fewest data points. Okay, so maybe there isn’t a linear relationship, and only those playing 42 hours or more per week show any type of effect.

Let’s take a closer look at the effect sizes. After all, even if something does definitely affect a person, if it only changes things a tiny amount, who cares? In statistics, an effect size close to 1 or -1 is huge while one close to 0 is tiny. In general, you need to get above about 0.50 in magnitude to consider it a large effect.

                    21+ hrs/wk     35+ hrs/wk     42+ hrs/wk
Marijuana           -0.129         -0.591         -0.468
Lower BMI           -0.251         -0.496         -0.194
Depression           0.052          0.703          0.731
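If you’re curious how numbers like these get calculated, here’s a minimal sketch of one standard measure, Cohen’s d. The values are invented, and I can’t confirm this is the exact statistic the thesis used:

```python
# Minimal sketch of a standardized effect size (Cohen's d) on invented data.
# Assumption on my part: the thesis used a d-like standardized measure.
import numpy as np

def cohens_d(group_a, group_b):
    """Difference in means divided by the pooled standard deviation."""
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)
    pooled_var = ((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)) / (len(a) + len(b) - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(1)
gamers = rng.normal(24.0, 3.0, 27)       # hypothetical BMI values
non_gamers = rng.normal(25.5, 3.0, 500)
print(f"d = {cohens_d(gamers, non_gamers):.2f}")
# By Cohen's rule of thumb, |d| near 0.2 is small, 0.5 medium and 0.8 large,
# which is why 0.50 is a reasonable bar for calling an effect "large."
```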

So let’s tackle these one at a time.

For marijuana use, while the effect of gamers smoking less always seems to be genuine, it only becomes large for those playing more than 35 hours of games per week, and it actually shrinks once they pass 42. So that extra hour a day suddenly makes you more likely to light up a joint.

Doesn’t make much sense.

For the lower BMI score, there is again a high chance the effect is real, but it only comes anywhere close to being a large effect after 35 hours of game play and all but disappears at all other levels. So once again, if you play 35 hours of games per week, something about that correlates to a lower BMI, but add just an hour a day or subtract two hours a day and the effect size diminishes greatly.

Doesn’t make much sense.

For depression, there is absolutely no correlation between playing a lot of games and becoming depressed for those playing 21 or 35 hours per week. But once you get up to six hours or more per day, there’s suddenly a very good chance that there’s a real effect and that it’s a big one.

That does make sense.

You might expect teens shut in their room for more than one-third of their day to develop depression. Whether the gaming is a cause or a symptom is, of course, up for debate. But typically people need real human interactions in a face-to-face manner to stave off mental health issues.

With only 27 teens represented, however, this study is by no means definitive. What it means is that, hey, there might actually be something to this, maybe some other folks should look into this more by designing a study to look at this specific question.

And seeing as how the data was taken from participants who were, on average, 15 in 1994 (making them 35 today), I’m guessing they could easily find enough participants in today’s youth who spend more than 42 hours per week playing games to conduct a study.

As for the university’s news office making claims about lower marijuana use and BMIs for teens that game a lot, I don’t buy it. I do, however, buy their use of the term “mouse potato.” That’s clever. But it also dates the study.

Who uses a mouse to play games anymore?

The study, “Long Term Effects of Video and Computer Gaming Heavy Use on Health, Mental Health and Education Outcomes Among Adolescents in the U.S,” is the thesis of Chennan Liu of the University of Illinois, who is now a professor of social work at Renmin University of China, and will be presented at a conference of the Society for Social Work and Research early next year.



Competition Produces Giant(er) Noggins

I have a large head. When I first started playing peewee football, they had to grab me a helmet from the varsity team. Once I got to junior high, they had to run over to the high school to find my head some protection. Hats don’t usually fit.

I look like an orange on a toothpick.

But I’ve got nothing on the big-headed ant, which, yes, is an actual species of ant. You’ve probably seen one as they live just about everywhere on the globe. Since their giant, Schwarzenegger-like noggins give their pincers incredible strength, they tend to take over new communities when transplanted by human activity.

That’s right, they’re an invasive species bent on world domination through their pincers.

Since the big-headed ant lives in so many regions in the world, researchers from the University of Illinois thought it might be an interesting opportunity to study how local conditions affect ant populations. More specifically, they thought that big-headed soldier ants might be larger or more numerous in areas with dangerous foes.

For example, Australia has numerous species of fierce, competitive ants, while Hawaii has zilch. And because workers and soldiers are determined by how they’re fed rather than by genetics, as are their counterparts in bee colonies and other social insects, the community has direct control over their growth.

So perhaps the colony might start feeding their offspring more “soldier jelly” earlier in their development if there’s a lot of danger around. Or maybe they’ll turn more larvae into soldiers than workers.

As it turns out, the former is the way they roll.

Although colonies in Australia and other areas of high competition don’t create more soldiers than their cousins living the easy life in Hawaii, they do make bigger soldiers. On average, the researchers found that soldier ants in Australia are three times bigger than those in Hawaii.

They also looked at three sites with intermediate levels of competition in Florida, Mauritius, and South Africa. And sure enough, the size of the soldiers all fell within the two extremes found in Australia and Hawaii.

The researchers hypothesize that the ants use chemical signals called pheromones to determine when the nest has the right proportion of soldiers to other workers. If this turns out to be the case, scientists may be able to control the ants with chemical signals without disrupting the surrounding ecology.

The study, “Body size variation and caste ratios in geographically distinct populations of the invasive big-headed ant, Pheidole megacephala (Hymenoptera: Formicidae),” was published in the Biological Journal of the Linnean Society by University of Illinois entomology professor and animal biology department head Andrew Suarez, along with postdoctoral researcher Bill Wills.


Messing with Bacterial Roll Calls

In politics, a quorum is the number of members you need present to vote on legislation. It’s that minimum target you avoid by having politicians leave the state to stop ridiculous policies from being rammed through the system. It’s the number you aim for when bringing unwilling participants into the room kicking and screaming to further your own agenda.

In the microbial world, quorum is also an important number. It determines whether a colony of bacteria lies dormant to avoid the immune system or rouses the troops to begin an infection. It can change a benign group of microbes into a deadly biofilm. Or it can activate certain processes that are beneficial to the digestive system.

In politics, a quorum is measured by taking roll call, by having the members present vocally indicate their presence. In the microbial world, roll call is taken through chemistry. Bacteria secrete signaling chemicals that their brethren can detect. And once the concentration of those chemicals reaches a certain tipping point, the bacteria start turning different genes on or off.
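To make that tipping point concrete, here’s a toy sketch of my own (not a model from Blackwell’s paper): gene activity as a switch-like function of signal concentration, the standard Hill-function caricature of quorum sensing.

```python
# Toy caricature of quorum sensing: gene activity jumps from "off" to "on"
# once the signaling chemical crosses a threshold concentration.
# Illustrative only -- not a model from Blackwell's paper.

def gene_activity(signal, threshold=1.0, steepness=8):
    """Hill-style switch: ~0 well below the threshold, ~1 well above it."""
    return signal**steepness / (threshold**steepness + signal**steepness)

for conc in (0.2, 0.8, 1.0, 1.2, 2.0):
    print(f"signal = {conc:.1f}  ->  gene activity ~ {gene_activity(conc):.2f}")
# Prints roughly 0.00, 0.14, 0.50, 0.81, 1.00: below quorum the genes stay
# quiet; past the tipping point the colony flips its behavior.
```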

Helen Blackwell, a professor of chemistry at the University of Wisconsin, is messing with those signals. But rather than bringing in extra bacteria squirming and secreting, she’s blocking or imitating their “here” call.

A chemical signal works like this: a bacterium produces a compound that travels to its neighbor and binds to a specific receptor protein. Blackwell and her research group started changing key building blocks of the receptor one by one to find which were important to the signaling. They expected to simply block the signal, but instead they found that they could completely flip the signal’s effect.

“It was surprising that making minor tweaks, very subtle changes, to the protein would convert a compound from an inhibitor to an activator, or turn an activator into an inhibitor,” said Blackwell. “That shows that small-molecule control of quorum sensing is very finely tuned, much more than we even expected.”

There are many potential applications to messing with bacterial communications signals, but perhaps the most interesting is medicine. As microbes evolve more defenses to our biggest and baddest antibiotics, it’s become more important to find other ways to thwart their infections.

Perhaps stopping their swarming signals is one way to go.

While current antibiotics are designed to kill microbes, the goal of manipulating quorum sensing would be to keep them “tame” and harmless, Blackwell says. “If these ‘on/off’ protein modifications are as important as we have found, they may help us design new compounds to inhibit quorum sensing and reduce the harm of bacterial infections, without causing the drug resistance that is producing so many problems today.”

The study, “Mutational Analysis of the Quorum-Sensing Receptor LasR Reveals Interactions that Govern Activation and Inhibition by Nonlactone Ligands,” was published by Blackwell with recent graduate students Joseph Gerdt and Christine McInnis.


What You Can’t Smell Might Kill You

Besides decreased mobility and a diminished ability to clean and bathe properly, there’s a good reason for that old-person smell everyone knows and hates—your sense of smell deteriorates as you get older. So even those who are blessed with a long-lasting schnoz simply might not be able to tell when their home starts reeking of cat urine.

Some people lose their sense of smell earlier than others, however, and certain pathologies can also lead to olfactory deterioration before its time. And according to a recent study from the University of Chicago, it can also predict death.

There are a lot of obvious physical abnormalities that can predict when a person is more likely to pass away in the coming years. Heart failure, cancer, and lung disease all come to mind; when one of those comes knocking, you can bet the Grim Reaper isn’t too far behind.

But believe it or not, losing your sense of smell should be much more troubling than any of those three.

“We think loss of the sense of smell is like the canary in the coal mine,” said the study’s lead author Jayant M. Pinto, an associate professor of surgery at the University of Chicago who specializes in the genetics and treatment of olfactory and sinus disease. “It doesn’t directly cause death, but it’s a harbinger—an early warning that something has gone badly wrong and that damage has been done. Our findings could provide a useful clinical test, a quick and inexpensive way to identify patients most at risk.”

The National Social Life, Health and Aging Project is a nationwide longitudinal study following a wide cross section of Americans ages 57 to 85. Starting in 2005, the study tested 3,005 participants and followed up with them five years later.

In that first year, most everyone could correctly identify at least four of five smells presented to them. But 3.5 percent were considered “anosmic,” meaning they could identify just one of the five scents or even none.

And of those who had a broken schnoz, 39 percent of them died within the next five years.

When the researchers adjusted for demographic variables such as age, gender, socioeconomic status (as measured by education or assets), overall health and race, those with greater smell loss when first tested were substantially more likely to have died five years later. Even mild smell loss was associated with greater risk. Acute loss of smell was a better predictor of mortality than every other physical ailment recorded other than liver failure.
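For the statistically inclined, “adjusting for demographic variables” usually means fitting something like the covariate-adjusted logistic regression sketched below. The data file and column names are hypothetical stand-ins of mine, not the actual NSHAP variables:

```python
# Hedged sketch of covariate adjustment: logistic regression of five-year
# mortality on smell performance plus demographic controls. The file and
# column names are invented placeholders, not NSHAP's real variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nshap_subset.csv")  # hypothetical: one row per participant

model = smf.logit(
    "died_5yr ~ smells_identified + age + C(gender) + C(education) + C(race)",
    data=df,
).fit()
print(model.summary())
# A negative, significant coefficient on smells_identified would mean that,
# even after the controls, identifying fewer odors predicts higher mortality.
```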

“Of all human senses,” Pinto said, “smell is the most undervalued and underappreciated—until it’s gone.”

Precisely how smell loss contributes to mortality is unclear. “Obviously, people don’t die just because their olfactory system is damaged,” said Martha K. McClintock, the David Lee Shillinglaw Distinguished Service Professor of Psychology.

Whatever the reasons, it’s a fascinating and unexpected result that leaves you wondering why. Sounds like an excellent beginning to somebody’s PhD thesis.

The study, “Olfactory Dysfunction Predicts 5-year Mortality in Older Adults,” was published in the Proceedings of the National Academy of Sciences by McClintock, Pinto, Kristen E. Wroblewski, David W. Kern and L. Philip Schumm, all from UChicago. Linda Waite is the principal investigator of NSHAP, a transdisciplinary effort with experts in sociology, geriatrics, psychology, epidemiology, statistics, survey methodology, medicine and surgery collaborating to advance knowledge about aging.


Showing Upwardly Mobile Chinese Your Face

If you’re looking to make a few bucks in the stock markets in the next decade, a good recommendation might be hotels catering to the Chinese. And if you’re really smart, you’ll look for brands that understand face.

No, I don’t mean focusing on brands willing to hire Emma Stone or Brad Pitt as their spokespeople; I mean the Chinese cultural value that encourages showing prestige, wealth and pride. If you think Americans have a penchant for buying luxury goods for no other reason than that they’re recognized as luxurious, you don’t know Wang.

I’m pretty sure you’ve heard about the thriving Chinese economy. As its billion-plus people continue pumping new goods into the market and new emissions into the atmosphere, the country is seeing the same rise of the middle class that America saw 50 years ago. This means more Chinese with extra spending money to buy goods and travel.

And traveling to America means major face.

A recent study from the University of Illinois looked at how Chinese citizens perceive three major U.S. hotel brands—Hilton, Holiday Inn and Super 8. Of the more than 600 people interviewed, only 10 percent had previously stayed at any of the three brands studied, although participants were aware of them, and the completed questionnaires were distributed almost equally across the three brands.

As you might expect, the respondents saw Hilton and Holiday Inn as luxurious brands whereas Super 8…not so much. If they were just looking to stay somewhere in China, they didn’t mind saving some money at a Super 8. But if they were going to America, they suddenly became more interested in intangible features of the hotel, such as prestige and luxury.

And, in short, the study says US hotels need to recognize that and do a better job presenting “face” to potential Chinese tourists.

All of that is well and good, but let me share some more troubling numbers from the study.

In 2013, more than 1.8 million Chinese tourists visited the United States. That figure is expected to rise to more than 2.1 million this year and increase roughly 20 percent every year through 2018. For those of you without calculators, that equals 5.2 million Chinese tourists.

So I’m really glad I don’t live in LA, NYC, DC, Las Vegas or Niagara Falls. There are only so many slow-moving tourists videotaping every second of their trip that I can take.

The study, “Modeling consumer-based brand equity for multinational hotel brands – When hosts become guests,” was published by Joy Huang, professor of recreation, sport and tourism at Illinois, and Liping Cai, professor of hospitality and tourism management at Purdue University.


Shocking Memory Improvements

Want to improve your memory? Just zap the hell out of it with some magnetic fields!

According to a new study from Northwestern University, targeted electric shocks to the surface of the brain can improve memory in adults more than 24 hours later. All you need is an MRI machine to identify the right spot on your cranium and a Transcranial Magnetic Stimulation (TMS) device to shock the ever loving shit out of it.

Okay, so it’s not exactly electric shocks per se. If you’ll remember back to your high school physics classes, electricity and magnetism are two sides of the same coin. Electric currents create magnetic fields, and changing magnetic fields induce electric currents. In TMS, a small but powerful pulsed magnetic field is focused into the surface of your brain – passing right through your skull – inducing electric currents that spread through your neurons.
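For the physics-inclined, the principle at work is Faraday’s law of induction: a magnetic field that changes in time drives a circulating electric field, and therefore currents, in nearby conductive tissue.

```latex
% Faraday's law of induction (differential form), the physics behind TMS:
% a time-varying magnetic field B induces a circulating electric field E.
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}
```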

The technique is not new; researchers have been using it for decades in studies to try to affect memory and behavior. Off the top of my head, I know of at least one program that uses it in combination with executive function training sessions to try to help addicts get a handle on their cravings.

This study was a little more elegant than most, however, as the researchers first used an MRI scan to identify the exact location they wanted to target on each participant. Even then, they couldn’t get directly to the spot they were after.

The primary region of the brain associated with memory is the hippocampus, which lies deep within the brain – much too deep to reach directly with TMS. So instead, the scientists targeted structures closer to the surface that have a ton of neural wiring leading directly to the hippocampus. In short, they were aiming for a domino that would knock over a bunch of other ones until it toppled the domino they really wanted to fall.

Each person received the TMS stimulation for 20 minutes each day for five days. After each round, they were shown a series of faces and given a word to remember that went along with each. They were then asked to recall that word when shown the face.

After a week’s worth of sessions, nearly every participant started performing better on the memory task. It’s the first time TMS has been shown to have a long-term positive effect on memory.

In addition, the MRI showed the stimulation caused the brain regions to become more synchronized with each other and the hippocampus. The greater the improvement in the synchronicity or connectivity, the better the performance on the memory test.

“The more certain brain regions worked together because of the stimulation, the more people were able to learn face-word pairings,” said Joel Voss, assistant professor of medical social sciences at Northwestern University Feinberg School of Medicine.

Using TMS to stimulate memory has multiple advantages, noted first author Jane Wang, a postdoctoral fellow in Voss’s lab at Feinberg.

“No medication could be as specific as TMS for these memory networks,” Wang said. “There are a lot of different targets and it’s not easy to come up with any one receptor that’s involved in memory.”

Who knows, maybe someday your iPhone 20b will come with a built-in magnetic coil so you can get a little memory boost to start your day. Then again, magnetic fields that strong probably wouldn’t work so well with handheld electronic devices.

Thank goodness for cloud storage backup.

The study, “Targeted enhancement of cortical-hippocampal brain networks and associative memory,” was published in Science by Voss along with other Northwestern authors Lynn M. Rogers, Evan Z. Gross, Anthony J. Ryals, Mehmet E. Dokucu, Kelly L. Brandstatt and Molly S. Hermiller.


A Made-to-Order Materials Menu

If you think ordering a drink from Starbucks can be a tall order, try picking out the right material for a new product or experiment. An iced, half-caff, four-pump, sugar-free, venti cinnamon dolce soy skinny latte may be a mouthful, but there are hundreds of thousands of known—and unknown—compounds to choose from, each with their own set of characteristics.

For example, take the family of materials Bi₂Sr₂Caₙ₋₁CuₙO₂ₙ₊₄₊ₓ, or bismuth strontium calcium copper oxide for short, if you can call it that. These compounds have the potential to change the world because of their ability to become superconducting at relatively high temperatures. But each sibling, cousin or distant relative has its own physical variations…so how to choose which to pursue?
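To see why that’s a “family” rather than a single compound, here’s a quick string-building sketch of mine that spells out the first few members by plugging values of n into the formula:

```python
# Spell out the first few members of the Bi2Sr2Ca(n-1)Cu(n)O(2n+4+x) family.
# Purely illustrative formula strings; no materials data involved.
def bscco_member(n):
    ca = "" if n == 1 else ("Ca" if n == 2 else f"Ca{n - 1}")
    cu = "Cu" if n == 1 else f"Cu{n}"
    return f"Bi2Sr2{ca}{cu}O{2 * n + 4}+x"

for n in range(1, 4):
    print(f"n = {n}: {bscco_member(n)}")
# n = 1: Bi2Sr2CuO6+x
# n = 2: Bi2Sr2CaCu2O8+x
# n = 3: Bi2Sr2Ca2Cu3O10+x
```

Each member has its own crystal structure and its own superconducting behavior, which is exactly why picking one to pursue is such a chore.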

Thanks to Stefano Curtarolo and his research group, now there’s an app for that.

Curtarolo leads a collection of research groups from seven universities specializing in something they call materials genomics. The group—called the AFLOW consortium—uses supercomputers to comb databases for similar structures and builds theoretical models atom-by-atom to predict how they might behave.

With help from several members of the consortium, Pratt School of Engineering postdocs Cormac Toher, Jose Javier Plata Ramos and Frisco Rose, along with Duke student Harvey Shi, have spent the past few months building a system that combs through four materials databases. Users can choose the elements and characteristics they want a material to have—or not to have—and the website will play matchmaker.

Want a two-element compound containing either silicon or germanium—but not gallium—that is stable enough to withstand high temperatures? Not a problem. How about an electric insulator made from transition metals with a certain crystal structure? AFLOW has you covered.
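Under the hood, a query like that boils down to filtering candidate entries by composition and properties. Here’s a hedged, plain-Python sketch of the idea; the records, field names and numbers are invented for illustration and are not AFLOW’s actual schema or API:

```python
# Invented sketch of the kind of filtering such a query implies.
# These records, field names and numbers are illustrative placeholders,
# not AFLOW's real database schema or REST interface.
from dataclasses import dataclass

@dataclass
class Material:
    formula: str
    elements: frozenset
    band_gap_eV: float      # > 0 roughly means insulator/semiconductor
    stable_to_K: float      # crude thermal-stability proxy

CANDIDATES = [
    Material("SiC",  frozenset({"Si", "C"}),  2.3, 2100.0),
    Material("GaAs", frozenset({"Ga", "As"}), 1.4,  900.0),
    Material("GeO2", frozenset({"Ge", "O"}),  4.3, 1300.0),
]

def search(require_any, exclude, min_band_gap=0.0, min_stable_K=0.0, max_elements=2):
    """Return candidates passing simple composition and property filters."""
    return [
        m for m in CANDIDATES
        if len(m.elements) <= max_elements
        and m.elements & require_any          # must contain at least one of these
        and not (m.elements & exclude)        # must contain none of these
        and m.band_gap_eV >= min_band_gap
        and m.stable_to_K >= min_stable_K
    ]

# "Two-element compound containing silicon or germanium -- but not gallium --
#  that is stable enough to withstand high temperatures"
hits = search(require_any={"Si", "Ge"}, exclude={"Ga"}, min_stable_K=1200.0)
print([m.formula for m in hits])  # -> ['SiC', 'GeO2']
```

The real engine does the same kind of matchmaking, just over hundreds of thousands of computed entries instead of three hand-typed ones.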

One of the four databases searched by the program draws from an international collection of compounds with structures known from experimentation. The other three contain single-, double- or triple-element compounds, and are not limited to previously explored materials. Through their molecule-building algorithms, the AFLOW consortium is constantly adding prospective materials to these four libraries.

The search engine currently can sort through more than 622,000 known and unknown compounds, and more than 1,000 new ones are added each week. Curtarolo, a professor of materials science and physics, hopes that the open-source program will continue to grow in materials and searchable characteristics to help scientists connect to their ideal material. To see how it works for yourself, take it for a test drive on the AFLOW consortium’s website.

As for an equivalent database of coffee drinks, that has yet to be built. So if you’re looking through AFLOW for a hot, lactose-free drink featuring 150 to 200 mg of caffeine and less than 200 calories made with beans from South America, you’re out of luck.


Chimps Need No Help for Violence

Whoever wrote the script for Dawn of the Planet of the Apes should have done some more homework. The primary antagonist of the film is Koba, a scarred bonobo who holds a grudge against humans for his mistreatment.

The only problem is that bonobos don’t seem to hold a grudge. Nor do they seem to have any violent tendencies whatsoever, for that matter.

The two closest species to humans are chimpanzees and bonobos, and one could argue that violence is one of the largest differences among the three. While humans kill each other for smudging their Pumas and chimpanzees have hundreds of documented instances of within-species killings, bonobos just don’t. Across four different bonobo communities under constant observation, there has been only one killing, and it can only be described as a suspected killing, at best.

This leads to some interesting questions. Did the last common ancestor of all three species have violent tendencies that bonobos have since evolved out of? Was the last primate ancestor common to humans and chimps peaceful, with both species evolving same-species violence independently since then? Or was it some combination of the two?

There is a fourth option that many conservationists have taken to recently—that it is humans that are causing chimps to be aggressive, not their own nature. Perhaps by constricting their habitats, injecting ourselves into their lives, feeding some groups while not others, and otherwise interfering with their natural lives, we are the cause of their killings.

Michael Wilson says, “nope.”

Wilson, lead author of a recent paper published in Nature and a researcher at the University of Minnesota, looked at the question by gathering data from 15 chimp and 4 bonobo communities. Over the past five decades, those chimp communities have seen 152 instances of observed, inferred and suspected killings.

After crunching the numbers, he determined that humans are not the cause of the observed aggression. Besides the best statistical models pointing to natural variables such as population size and density, there is plenty of anecdotal evidence.

The highest killing rate occurred at a relatively undisturbed and never-provisioned site, the least disturbed site had at least two suspected killings, and the site that was rated as the most disturbed by humans had zero.

What’s more, one would think that if humans were the cause, the communities would be getting more violent over time. Despite some claims that this is indeed happening, Wilson found no statistical increase in reports of killings during the past five decades. Sure there are more reported instances, but that’s because there are more communities being watched.

“The most important predictors of violence were thus variables related to adaptive strategies: species; age–sex class of attackers and victims; community membership; numerical asymmetries; and demography,” wrote Wilson in the paper. “We conclude that patterns of lethal aggression in [chimps] show little correlation with human impacts, but are instead better explained by the adaptive hypothesis that killing is a means to eliminate rivals when the costs of killing are low.”

The paper, “Lethal aggression in Pan is better explained by adaptive strategies than human impacts,” was published in Nature by Wilson, along with 29 co-authors at sites across Africa.


Sunday Runday Funday Scientifically Proven

A new study from Northwestern University that is making use of all of those new-fangled fitness applications and wearables has revealed an athletic truism that I would have bet money on based on anecdotal evidence—people drink more on days that they exercise.

I know I, for one, do this on a regular basis.

There are two reasons for this. First, after an hour-long, tough workout, I usually feel I can spare the calories and indulge in an extra Founders Breakfast Stout. Rather than seeing the workout as an accomplishment toward my health, I take the typical American route and use it as an excuse to completely erase the good I just did.

The second reason is simply a matter of convenience. People tend to work out more toward the end of the week and on the weekends. Why is that, you might ask? In my experience, it’s because they have more time on their hands, particularly on Saturday and Sunday. More free time equals more exercise time.

But it also means more time to drink.

In the study, researchers asked 150 study participants ranging in age from 18 to some bad-ass 89-year-olds to record their physical activity and alcohol consumption daily on their smartphones for 21 days. They did this at three different times during the year. Not only did this allow participants to take a break and hopefully take better care of their recordings during their three-week intervals, it also helped account for seasonal variations in exercise and alcohol consumption.

Let’s face it, if the study were done entirely during football season, the results would probably be pretty skewed.

“In this study, people only have to remember one day of activity or consumption at time, so they are less vulnerable to memory problems (outside of blackouts*) or other biases that come in to play when asked to report the past 30 days of behavior,” said David Conroy, lead author and a professor of preventive medicine and deputy director of the Center for Behavior and Health at Northwestern University. “We think this is a really good method for getting around some of those self-report measurement problems.

“We zoomed in the microscope and got a very up-close and personal look at these behaviors on a day-to-day basis and see it’s not people who exercise more drink more — it’s that on days when people are more active they tend to drink more than on days they are less active,” Conroy continued. “This finding was uniform across study participants of all levels of physical activity and ages.”

It’s a vicious cycle. Batten down the hatches Monday through Thursday, eat well, drink less, and try to undo all the drinking damage that Friday through Sunday brought. It sounds like I’m not alone.

The study, “Daily Physical Activity and Alcohol Use Across the Adult Lifespan,” was published in Health Psychology by Conroy along with Nilam Ram, Aaron L. Pincus, Donna L. Coffman, Amy E. Lorek, Amanda L. Rebar and Michael J. Roche of The Pennsylvania State University.

*comment added by me, not originally quoted from Northwestern
