Tag Archives: time

When was Jesus born?

Every last one of us leaves a mark on history. Most of us only make a tiny, shallow line for our family and friends to notice. A few leave deep grooves that countless other marks align with. Whether you’re a believer or not, the historical figure of Jesus Christ is inarguably one of the latter. Much of the Western world as we know it was shaped by his life, by the stories people tell about him and the philosophy attributed to him, and by the religion others built around him.

Image credits Greg Montani.

One of the most widely used calendars in the world today — the Gregorian calendar — is timed from the birth of this person. It separates history into two large parts: B.C., “before Christ”, and A.D., “anno Domini”, meaning “in the year of the Lord”. The birth of Jesus is, naturally, where we shift between the two.

At least, that’s the theory. To the best of our knowledge, Jesus wasn’t born on what we consider to be the 1st anno Domini. With that in mind, though, “the best of our knowledge” on this topic is quite muddy. So roll up your sleeves and let’s dive right into it.

Tweak for glory

For starters, although Christians celebrate the birth of Jesus at Christmas (December 25th), we’re pretty sure that’s not actually when it happened. But we can’t say for sure when it did happen, either. Part of the problem is that Jesus wasn’t born famous, so nobody bothered to record the exact date. There’s also the issue that our current dating system had not yet been invented at the time, and converting dates between systems is imperfect at best. Factor in the huge spans of time involved here, and accuracy is out of the question.

However, what we do know is that, at some point, Christianity was an underdog among religions. It faced quite an uphill battle gathering new followers in several communities, especially polytheistic ones. This new, one-god religion was simply very strange to them and to the customs they held. Better-off people were also wary of it, as adopting a new religion often came with a social cost. Not to mention that following teachings which decried slavery and looked down on riches wasn’t high on the priority list of people who enjoyed owning slaves and being rich.

In the Roman Empire, the largest single community Christianity was trying to win over at the time, both of these issues were at work simultaneously.

So what Christianity did was a little bit of PR. Christmas today is celebrated very close to the winter solstice. Many ancient peoples aligned their celebrations with significant natural events, such as the solstice. Whether this was intentional on their part is a very interesting question, but it’s not particularly relevant right now. What is relevant is that, by shifting dates around a bit, Christian customs would better mirror the pagan ones they were competing against. In other words, they would feel more familiar to those the religion was trying to convert. Christmas felt less like a completely new celebration and more like an updated, reskinned one — and, so, easier to accept.

In the case of Rome, the end of December marked Saturnalia. This was a celebration in honor of Saturn, the Roman god of agriculture and the harvest, and it lasted roughly from the 17th to the 23rd. Symbolically speaking, this was a good celebration to associate yourself with, as it was customary for everyone to enjoy freedom during this time, so social norms were laxer, sometimes discarded altogether. To be more specific, Saturnalia saw an inversion of social roles.

Slaveowners, for example, would dress, feed, and entertain their slaves as they would a friend. The slaves, in turn, could air their grievances to their masters during this time without fear of reprisal. It was a celebration meant to ‘reset your karma‘, so to speak. Gambling was also allowed during Saturnalia, and carnivals were common. In the grand scheme of things, someone celebrating Christmas would probably stand out far less during Saturnalia than at any other time of the year.

This is also probably where we get the custom of exchanging gifts at Christmas. Romans exchanged gifts with their friends for Saturnalia, although these were typically small figurines or gag items, and there most definitely weren’t any trees involved.

Of course, none of this actually proves that Christmas was shifted around the calendar to make it more palatable to pagans. But it’s very likely that it was, because there are too many coincidences to ignore. Further evidence that December 25th isn’t true to the historical date of Jesus’ birth is that the Orthodox Church in the Byzantine half of the Roman Empire set the date of Christmas at January 6th. If one church could change the date, why couldn’t another?

The origin of the Christmas date is a broader topic than I care to get into, but the Washington Post has a nice breakdown of it here.

Not exactly on time

“Saturnalia” sculpture by Ernesto Biondi, in Buenos Aires. Looks like a fun celebration.

So we already know the birth date is probably off, although we don’t know by how much. The thing to keep in mind here is that the texts which make up books such as the gospels weren’t written while Jesus was around, by people who were around him. They were written some time after — often, a very long time after — by people working mostly off hearsay. That’s not a knock on them; it’s simply the product of an age when writing was still a rare skill, and it was par for the course at the time.

This material was also heavily curated, edited, tweaked, and cleaned up by (probably) well-meaning but (in my opinion) extremely biased and damaging individuals as Christianity evolved into a mainstream religion. A mainstream religion, after all, needs mainstream-able texts, and having worked in media, I can assure you that the first draft is never that. Large parts of the early biblical texts were taken out, and what was left was re-ordered and re-worded to better suit individual agendas. It was an ongoing process, not a single event, as most people who sought power through religion wanted a bible that would fit their narrative better than those of others.

But we’ll turn the other cheek to that. I’m not telling you all this to invalidate anyone’s faith. If you believe, you believe. Personally, I don’t. But I think we can all agree, no matter what side of that fence we’re on, that understanding the actual historical facts in the story is a worthwhile pursuit. We are, after all, talking about one of the most influential people in the West, and maybe globally.

I’m also telling you all this so you’ll understand why I don’t particularly rely on the texts themselves for answers. They were maintained by people, and people are both fallible and biased. We’re also talking about thousands of years here, so there was probably a lot of fallibility and bias involved. In other words, the texts themselves are not a reliable source if what you’re after is an accurate understanding of what happened and when. Not only that, but these are religious texts; they were never intended to preserve chronology, but theology. As far as they are concerned, the dates are not as important as the message.

Back to the year

While religious texts aren’t reliable as direct sources, they do offer useful context, which we can then check against what we know from historical records and archeological digs to hopefully arrive at the truth.

One of the first attempts in this regard was to date the birth of Jesus using the figure of Herod. In the Bible, shortly before his death, Herod orders all male infants under two years old in the Bethlehem region (where Jesus was born) to be killed. The good news here is that we have a rough timeline for when Herod died: around 4 B.C. The bad news is that that’s not a reliable date by any stretch, and that the rest of the story seems to be made up as well. Still, if we take these at face value, Jesus was likely born between the years 6 and 4 B.C.

The story also holds that Jesus’ birth was heralded by a star — the Star of Bethlehem. It has been proposed that this star was actually a slow-moving comet, one that Chinese observers recorded around 5 B.C. This fits well with our previous estimation, which is a plus, but it also basically boils down to “hey these two events fit so they could be the same”. This isn’t necessarily a wrong conclusion, but it definitely isn’t proof.

Reasonable Theology makes a valiant effort at estimating the birth date of Jesus drawing mostly from scripture here (it’s a pretty interesting read). I’m not that familiar with everything going on in the bible, so I’ll have to take their word for it, but the conclusion they draw from several passages is that Jesus was born sometime between 6 and 5 B.C. This, again, fits with the previous estimation and is a little more reliable, as it ties events in the story to historical figures such as Emperor Caesar Augustus and Governor Publius Sulpicius Quirinius, who are somewhat well-anchored in history.

It also loosely fits with the Aemilius Secundus inscription, a tablet discovered 300 years ago in Beirut, Lebanon, which tells of a census ordered by Quirinius, the governor of Syria, in 12 B.C., according to biblical scholar Jim Fleming. This census is mentioned in the texts, although different gospels disagree on whether Jesus was born before or after it.

However, there are some grounds to believe that Herod actually died around the year 1 B.C., which would put Jesus’s birth around the year 3 B.C.

All things considered, we can estimate with some certainty that Jesus was born between 6 and 4 B.C., and with less certainty that it happened a few years later than that. But everybody is pretty confident that he — ironically — was not born in ‘the first year of the Lord’.

Since we can’t yet know for sure exactly when it happened, this tiny incongruity will have to stick around for a while longer. That being said, our calendars exist so that practical matters like historical events or yearly tax records can be kept in an organized fashion that future generations can still use, should they need to. Although we think of years as either before or after Christ, they are primarily a chronological tool, not a theological one.

Farmers actually work more than hunter-gatherers, have less leisure time

New research says that agriculture may not have been the smartest move we ever pulled. The authors of the study report that hunter-gatherers in the Philippines who are transitioning towards agriculture work for significantly longer each day. Women seem to be the hardest hit by this transition.

Image via Pixabay.

A team of researchers led by University of Cambridge anthropologist Dr. Mark Dyble lived with the Agta people, a group of small-scale hunter-gatherers from the northern Philippines who are increasingly engaging in agriculture. The team says that engagement in farming and other non-foraging work resulted in the Agta working harder and for more time every day — in essence, it ate into their leisure time. On average, the Agta that primarily engaged in agriculture worked 10 more hours per week compared to foraging-focused ones. The women living in agricultural communities were especially hard-hit: on average, they only had half as much leisure time as their hunter-gatherer counterparts.

Toils of the earth

“For a long time, the transition from foraging to farming was assumed to represent progress, allowing people to escape an arduous and precarious way of life,” says Dr Dyble, first author of the study.

“But as soon as anthropologists started working with hunter-gatherers they began questioning this narrative, finding that foragers actually enjoy quite a lot of leisure time. Our data provides some of the clearest support for this idea yet.”

The researchers recorded what the Agta were up to at regular intervals between 6 am and 6 pm for every day they were there, across ten Agta communities. Using this data, the team then calculated how 359 Agta managed their time: in particular, they were curious to see how much time they assigned to leisure, childcare, domestic chores, and out-of-camp work per day. Some of the Agta people in the study engaged in hunting and gathering exclusively, while others mixed foraging with rice farming.

Increased engagement in farming and other non-foraging activities was linked to larger workloads and less leisure time, the team reports. On average, the Agta who engaged primarily in farming worked roughly 30 hours per week, while those who only foraged worked around 20 hours, the team estimates. The difference was largely driven by women, they add, who had to forgo domestic activities to work in the fields. Women living in the communities most involved in farming had half as much leisure time as those in communities which only foraged.

Both men and women had the least leisure time at around 30 years of age, with the amount increasing steadily later in life. Overall, women spent less time working outside of the camp, and more on domestic chores and childcare (in-camp activities), than men. All in all, however, both sexes enjoyed a roughly equal amount of leisure time. As mentioned above, though, the adoption of farming had a disproportionate impact on women’s lives.

“This might be because agricultural work is more easily shared between the sexes than hunting or fishing,” Dr Dyble says. “Or there may be other reasons why men aren’t prepared or able to spend more time working out-of-camp. This needs further examination.”

“The amount of leisure time that Agta enjoy is testament to the effectiveness of the hunter-gatherer way of life. This leisure time also helps to explain how these communities manage to share so many skills and so much knowledge within lifetimes and across generations,” says Dr Abigail Page, an anthropologist at the London School of Hygiene and Tropical Medicine and one of the paper’s co-authors.

However, “we have to be really cautious when extrapolating from contemporary hunter-gatherers to different societies in pre-history,” she adds. “But, if the first farmers really did work harder than foragers then this begs an important question — why did humans adopt agriculture?”

The paper “Engagement in agricultural work is associated with reduced leisure time among Agta hunter-gatherers” has been published in the journal Nature Human Behaviour.

Time flies as we age because our brains get bigger and less efficient, a new paper proposes

New research from Duke University says time flies as we age because of our brains maturing — and degrading.

Image credits Gerd Altmann.

The shift in how we perceive time throughout our lives takes place because our brain’s ability to process images slows down, reports a study penned by Adrian Bejan, the J.A. Jones Professor of Mechanical Engineering at Duke. This is a consequence of the natural development of our brains, as well as wear and tear.

Hardware, oldware

“People are often amazed at how much they remember from days that seemed to last forever in their youth,” said Bejan. “It’s not that their experiences were much deeper or more meaningful, it’s just that they were being processed in rapid fire.”

Bejan says that, as the bundles of nerves and neurons that make up our brains develop in both size and complexity, the electrical signals that encode sensory data have to travel along longer paths. We also grow in size, making the nerves feeding information to the brain physically longer. Nerve fibers are good conductors of electricity — but they’re not perfect, and all that extra wiring means signals have farther to travel, slowing the transfer of data in our biological computers.

Wear and tear also play a role, he adds. As neural paths age, they also degrade, which further chips away at their ability to transport information.

These two elements combine to slow down our brain’s ability to transport, and thus process, data. One tell-tale sign of processing speeds degrading with age is the fact that infants tend to move their eyes more often than adults, Bejan explains. It’s not that they’re more ‘filled with energy’ or simply have shorter attention spans. Younger brains are quicker to absorb, process, and integrate new information, meaning they need to focus on a single object or stimulus for shorter spans of time to take it all in.

So, how does this impact our perception of time? The study explains that older people basically view fewer new images in a given unit of time than younglings, due to the processes outlined above. This makes it feel like time is passing more quickly for the former. Objective, “measurable ‘clock time’ is not the same as the time perceived by the human mind,” the paper reads, as our brains tend to keep track of time by how many new bits of information they receive.

“The human mind senses time changing when the perceived images change,” said Bejan. “The present is different from the past because the mental viewing has changed, not because somebody’s clock rings.”

“Days seemed to last longer in your youth because the young mind receives more images during one day than the same mind in old age.”
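
To make that mechanism a bit more concrete, here is a minimal toy sketch in Python (my own illustration, not a model from Bejan’s paper): if the mind marks time by counting the new ‘frames’ it manages to process, then a slower processing rate means fewer frames per clock hour, and the same hour feels shorter in hindsight.

```python
def perceived_length(clock_hours, frames_per_hour, baseline_frames_per_hour=1000):
    """Toy model: perceived duration scales with how many new 'frames' the
    brain processes, measured against a youthful baseline rate. The rates
    here are arbitrary illustrative numbers, not measured values."""
    frames_processed = clock_hours * frames_per_hour
    return frames_processed / baseline_frames_per_hour

# A young brain processing 1,000 frames/hour experiences an hour as a full hour.
print(perceived_length(1.0, 1000))  # -> 1.0
# An older brain processing 700 frames/hour registers only 0.7 "hours' worth"
# of new images in the same clock hour, so it seems to have flown by.
print(perceived_length(1.0, 700))   # -> 0.7
```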

It’s not the most heartening of results — who likes to hear their brains are getting laggy, right? — but it does help explain why we get that nagging feeling of time moving faster as we age. And, now that we know what’s causing it, we can try to counteract the effects.

That being said, maybe having a slower brain isn’t always that bad of a thing. If you’re stuck out on a boring date, or grinding away inside a cubicle from 9 to 5, at least you feel like you’re getting out quicker. Glass half full and all that, I suppose.

The paper “Why the Days Seem Shorter as We Get Older” has been published in the journal European Review.

Child and teen obesity on the rise as they’re consuming too much… screen time

If you want to see your health improve, stop looking at the screen.

Image credits Paul Henri Degrande.

A new scientific statement from the American Heart Association warns that children and teens should try to wean themselves off screens. Screen time on any device is associated with more sedentary behavior, they explain, which promotes obesity and the other health complications associated with a lack of physical exercise.

The heart of the issue

Sedentary behaviors — things like sitting, reclining, or lying down while awake — use little physical energy and contribute to excess weight and obesity. That’s not exactly news. However, we’re spending more time than ever before with our eyes glued to screens, and this is especially true for children, teens, and yours truly.

Now, the American Heart Association (AHA) says that this lifestyle poses serious consequences to the health of teens and children.

The new scientific statement — a scholarly synopsis of a topic and the official point of view of the issuing organization — was developed by a panel of experts who reviewed the existing literature on how sedentary behavior relates to cardiovascular disease and stroke. The document holds that children and adolescents have seen a net increase in the recreational use of screen-based devices over the last twenty years. While TV-viewing has declined over the same period, those hours were taken over by other devices such as smartphones and tablet computers.

Current estimates are that 8- to 18-year-olds spend more than 7 hours using screens daily, according to the paper. However, the authors caution that almost all of the available scientific literature on this subject relied on self-reported screen time. Very few of the studies looked at which types of devices were used in different contexts, they add. All in all, this means that the studies can’t be used to establish a cause-effect relationship between the use of these devices and the health complications examined as part of the paper.

There is a large body of evidence pointing to the relationship between screen time and obesity, however. Writing for Reuters in late 2016, Lisa Rapaport reported that “a minimum five-hour-a-day [TV time] increased the odds of obesity by 78 percent compared with teens who didn’t have TV time,” and that similarly “heavy use of other screens was tied to a 43 percent greater risk of obesity.”

“Still, the available evidence is not encouraging: overall screen time seems to be increasing — if portable devices are allowing for more mobility, this has not reduced overall sedentary time nor risk of obesity,” says Tracie A. Barnett, chair of the writing group.

“Although the mechanisms linking screen time to obesity are not entirely clear, there are real concerns that screens influence eating behaviors, possibly because children ‘tune out’ and don’t notice when they are full when eating in front of a screen.”

“There is also evidence that screens are disrupting sleep quality, which can also increase the risk of obesity,” Barnett said.

The most important takeaway from the study is for parents and children to try limiting screen time, the authors add. The AHA recommends that children and teens get no more than 1 or 2 hours of recreational screen time daily, a limit the authors also support. Given that younglings already “far exceed these limits,” they add, parents should step up to the plate and be vigilant about their children’s screen time, “including phones,” Barnett believes.

Efforts to minimize screen time should center on parent involvement, the team explains. Parents can help children reduce the time they spend on devices by setting a good personal example and establishing screen-time rules around the house.

Try to keep screens out of the bedroom (as much as one can do that in the 21st century), the team adds, as some studies have shown they can interfere with sleep patterns. Also, try to maximize face-to-face interactions and outdoor activities.

“In essence: Sit less; play more,” Barnett explains.

The team says that more research is needed to understand the long-term effects of screen time on children and teens. We also don’t really know how to help youngsters be less sedentary — a problem that the appeal of screens aggravates, but doesn’t necessarily cause. Before we can address this imbalance in how children and teens choose to spend their time, we need more comprehensive information on the impact of today’s sedentary pursuits.

The paper “Sedentary Behaviors in Today’s Youth: Approaches to the Prevention and Management of Childhood Obesity: A Scientific Statement From the American Heart Association” has been published in the journal Circulation: Journal of the American Heart Association.

You’re doing fun activities wrong — but a new study reveals how to do them right

Scheduled fun isn’t as enjoyable as spontaneous fun, apparently.

Image credits Andreas Lischka.

The way you try to have a good time might be just what’s keeping you from it, a new paper suggests. According to the research, performed by a duo of scientists from the Ohio State University (OSU) and Rutgers Business School (RBS), planning leisure activities ahead of time makes us enjoy them less compared to spontaneous or more loosely scheduled events.

It’s all about the timing

The paper explains that we tend to subconsciously ‘lump together’ all of our scheduled activity under the same mental group. It doesn’t matter if said activity is going to the dentist, paying your taxes, or a date with a special someone — if it’s scheduled, it goes in the same group. In the end, that makes us more likely to perceive pleasurable activities as chores, the authors explain, draining them of some enjoyment.

“It becomes a part of our to-do list,” Selin A. Malkoc, study co-author and an associate professor at OSU, wrote in an email to The Washington Post. “As an outcome, they become less enjoyable.”

“When scheduled, leisure tasks feel less free-flowing and more forced — which is what robs them of their utility.”

Part of the problem, Malkoc believes, is cultural. We place such a high value on achievement that even fun and contentment become secondary. Most of us live hectic lives, juggling work, school, social events, hobbies, sports, and many other activities that require an investment of time and energy. We jam-pack our schedules, fearing that we will never do all that we want to do if we give ourselves some free time, Malkoc explains. Because of this over-commitment to achievement, “people even strive to make leisure productive and brag about being busy,” the paper explains.

In the end, we do more — but we enjoy all of it less.

The paper builds on a 2016 study published by the two researchers, in which they pooled data from 13 previous studies on the enjoyment of leisure activities. After analyzing all the results in parallel, the team concluded that scheduling leisure activities — from getting a carwash to test-driving a car to watching a fun video — had a “unique dampening effect” on their enjoyment.

In one of the 13 studies, the authors gave students a hypothetical calendar consisting of classes and other activities. Some of the students were asked to schedule a frozen yogurt outing with friends, two days in advance, and add it to the calendar. The rest were asked to imagine they ran into a friend by chance and ended up going to the same frozen yogurt place — but spontaneously. Both groups were later asked to report how they felt about the situation.

The first group — the schedulers — ended up perceiving the event “more like work,” the paper concludes.

So, then, what can we do to enjoy some downtime but still get something done? Malkoc believes “rough scheduling” could be the answer. Boiled down, this approach means setting up plans to meet for lunch or an after-work drink with someone, but not assigning it a time per se. If this loose plan isn’t enough to make the meetup happen, she adds, that may be for the best.

“As trivial as the change might seem, it has an important effect on human psychology: It reintroduces the flexibility to the leisure tasks,” Malkoc wrote in her email to The Washington Post.

“If things don’t work out, in all likelihood at least one of the parties was forcing themselves to make it happen – and thus would enjoy it less. So, maybe things worked out for the best, right?”

Malkoc uses the approach in her own personal life, saying it works just fine and that her friends “are willing to play along”. Rough scheduling was also the subject of one of the previous studies she performed with her co-author, Gabriela Tonietto.

It included 148 college students who agreed to take a break for free coffee and cookies during finals. Half of these students were asked to come in at a specific time for their snack, while the others were given a two-hour window during which they could do so. The first group reported enjoying their break less than those who were given a window, according to the study.

Another piece of advice Malkoc would give is to simply stop trying to fit so many different activities into our schedules. A good place to start would be to prioritize how much we enjoy activities rather than how many of them we squeeze in, she suggests.

“Be more selective in what we choose to do … take the liberty to let things go,” she concluded in her emails. “This is not to say we should never make plans. But we can prioritize better and let go of our fear of missing out.”

The paper “The Calendar Mindset: Scheduling Takes the Fun Out and Puts the Work In” has been published in the Journal of Marketing Research.

It takes about 200 hours with someone to turn them into a best friend, new study shows

Credit: Pixabay.

Humans long to bond with their peers — a fundamental urge that may be evolutionarily rooted. We are often in the company of other people, be it at school, work, or at home; if we’re not, it can become a problem. A lack of human contact in one’s life can have devastating effects on health, with one study finding loneliness to be deadlier than obesity. Conversely, a large social network and a socially engaging life with members of that network are factors that predict overall health and subjective well‐being.

Jeffrey Hall, a professor of communications studies at the University of Kansas, is the author of the Communicate Bond Belong (CBB) theory, which proposes “that a social interaction operates within a homeostatic system, developed from internal pressures to satiate a need to belong, shaped by competing desires to invest and conserve social energy, and adaptable to new social circumstances and technological affordances.” To satisfy this need, people invest time and energy into building bonds. Now, in the new research, Hall and colleagues quantified this temporal dimension, essentially learning how much time it takes, on average, for people’s relationships to evolve.

A time for friendship

In one study, the researchers interviewed 355 adults who had relocated to a new city within the previous six months. This was a great demographic to work with, since the participants were forced by circumstances to build a new social circle essentially from scratch. Each participant was asked to identify people they had met since moving, excluding family members, romantic interests, and anyone they had known before. The participants specified where they met each new person and how much time they spent together, on average, in a typical week. Each new person introduced to a social circle was rated on a scale from acquaintance to best friend.

A second study included 112 University of Kansas freshmen — students who were exposed to many opportunities to meet new people and possibly befriend them. The students were asked to name two new acquaintances, and then report back to the researchers three times over the course of nine weeks of school how these relationships had changed.

For both studies, the researchers focused on identifying so-called cut-off points, where there was a 50% likelihood that a relationship switched from acquaintance to casual friend, from casual friend to friend, or from friend to close friend. In terms of time, it took about 50 hours of interaction to move from acquaintance to casual friend, 90 hours to go from casual friend to friend, and more than 200 hours for a person to fall into the ‘best friend’ category. Acquaintances who had never moved up the social circle had usually spent fewer than 30 hours together.
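
Purely as an illustration of those cut-off points, here is a minimal Python sketch that maps logged interaction hours to the tier the study’s thresholds make more likely than not. The hour values are taken from the figures above; the function itself is hypothetical, just for intuition, and not something from the paper.

```python
def likely_tier(hours_together):
    """Map cumulative interaction hours to the friendship tier that the
    reported 50% cut-off points suggest is more likely than not
    (illustrative only)."""
    if hours_together < 50:
        return "acquaintance"      # most pairs under ~30 h never progressed at all
    elif hours_together < 90:
        return "casual friend"
    elif hours_together < 200:
        return "friend"
    else:
        return "close/best friend"

for hours in (25, 60, 120, 250):
    print(hours, "->", likely_tier(hours))
```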

The study suggests that time spent together is a highly important ingredient in establishing meaningful connections. People who move to a new city for study or work and struggle to make new friends might want to keep these findings in mind — making friends shouldn’t feel like work, but it sure does take time.

The researchers also found that spending time together doesn’t automatically turn two people into friends — go figure. Some of the participants reported spending hundreds of hours with colleagues who were still classed as acquaintances at the end of the study. This usually happened when acquaintances didn’t spend leisure time together (outside of school or work).

Hall says that having friends isn’t just a life pleasure, it’s also a necessity. Over the years, much research has shown that friends influence your happiness and habits — whether you’ll smoke or drink, work out, stay thin or become obese. The findings show that making friends is an investment that requires time and a bit of strategy (asking acquaintances to join you in leisure activities outside a formal environment). If you’re the kind of person that struggles to make friends, besides social skills, you might want to evaluate how much free time you set aside every week for seeing friends and building relationships with the new people you’ve met.

The findings appeared in the Journal of Social and Personal Relationships.

You can’t chase your happiness and have it too — you must let it come to you

Happiness — while it’s something we all want, new research shows that it shouldn’t be something we pursue.

For once, these lame-o feely-goody snaps actually have a point.
Image credits Antonio Quagliata.

The actual pursuit of happiness might be what’s keeping you from it, a new paper reports. People who make a conscious effort to attain happiness often feel like they don’t have enough free time during the day, which, paradoxically, ends up making them feel unhappy. The findings are based on four studies which probed into how the pursuit of happiness and the state of happiness influence people’s perception of time.

Time to be happy

Some of the participants in these studies were asked to list things that would make them happier; the others, to try and make themselves feel happy while watching a (rather dull) video about bridge-building. Later, all participants reported on how much free time they felt they had throughout the day.

The two tasks were meant to illustrate the difference between treating happiness as a goal still to be pursued and treating it as something already achieved. Each had a counterpart framed the other way around: watching something genuinely enjoyable instead of a boring old movie about bridges, or looking over a list of things showing that one already has a lot to be happy about. Being told you have to work to feel happy naturally implies that you’re not happy yet; the ‘already achieved’ framings imply the opposite.

The results were quite interesting. The team reports that an individual’s pursuit of happiness — which the team cheerily refers to as “unattainable” — can influence their perceived time scarcity; in other words, those running towards happiness end up feeling like they’re running out of time. This feeling was lessened, however, for participants who considered that they had already achieved happiness to some degree.

“Time seems to vanish amid the pursuit of happiness, but only when seen as a goal requiring continued pursuit,” the researchers explain. “This finding adds depth to the growing body of work suggesting that the pursuit of happiness can ironically undermine well-being.”

They add that this suggests happiness can become a drain on our emotional state, but that it doesn’t have to. If you stop and appreciate the happiness you have achieved — and I think all of us find happiness, large or small, in something — you’ll use your time to appreciate it, rather than endlessly run towards new ‘sources’. The paper also underscores that people have different concepts about what happiness is, and that this will further influence how they perceive time scarcity.

Own less, appreciate more

The researchers also say there’s a more insidious aspect regarding this perceived time scarcity. The worse it gets, the more people start to move away from the things that actually give them happiness, and towards possessions that give the illusion of happiness — forming a vicious cycle.

“Because engaging in experiences and savoring the associated feelings requires more time compared with merely, for instance, buying material goods, feeling a lack of time also leads people to prefer material possessions rather than enjoying leisure experiences,” they explain.

Feeling pressed for time, they add, makes us less likely to help others or to volunteer — and it’s immaterial things like generosity, selflessness, experiences, or simply being part of a community, that make us sustainably happy.

“By encouraging people to worry less about pursuing happiness as a never-ending goal, successful interventions might just end up giving them more time and, in turn, more happiness.”

Still, while the results show that happiness is fleeting when pursued and that it dramatically alters our perception of time, the research doesn’t offer much in the way of why this happens. Considering that our perception of time availability is such a big factor in our day-to-day decisions and quality of life, the team thinks it essential for further research to uncover when, why, and how people budget their time in pursuit of happiness and other goals.

To me, it shows the importance of tempering the need for more, the drive to improve, our natural desire for higher status and a better life, with an appreciation of what we have achieved or have been given, of those we love, of the beauty in things as mundane as a wisp of wind. Happiness, then, won’t lie beyond the next hill — it will be right here with us, making the climb easier.

The paper “Vanishing time in the pursuit of happiness” has been published in the journal Psychonomic Bulletin & Review.

Who do you spend most of your time with? The answer might surprise you

Who the average American spends his or her time with might surprise you — especially in the later stages of life.

Image credits Dean Moriarty.

How do you slice up your day, and who gets the largest bite? “Friends and loved ones” is probably the immediate answer most people would give — but that may not be the case, especially for Americans. Thankfully, data scientist Henrik Lindberg asked himself the same question and, with the tools of his trade, set out to find the answer.

Using data from the American Time Use Survey, carried out annually by the US Bureau of Labor Statistics between 2003 and 2015, Lindberg crunched the numbers on how the average American spends their time. His work is undeniably illuminating, but we mere mortals likely find raw figures less palatable than data scientists do, so Quartz’s Dan Kopf saved the day by breaking down the findings into a sleek series of charts dealing with six main social circles: friends, family, coworkers, romantic partners, children, and finally, ourselves.

The fact that parents take up the lion’s share of our time during childhood doesn’t come as much of a surprise, nor does the fact that this time decreases as we age and leave the nest. Around our 20s, that time increasingly goes towards nurturing friendships, working alongside coworkers, and enjoying the more pleasant company of our romantic partners.

These last two groups — coworkers and partners — will remain a constant theme throughout our lives. The time spent with our own children makes a roaring debut, plateaus around 50, and then steadily declines. But sometime in our teens, one person makes an appearance in our timetable who will steadily rise to prominence as the one we share companionship with the most, especially after the age of 50 when the young’uns grow up — ourselves.

So does your grandma have a point when she says you never call? Maybe. But at the same time, older people actually report feeling less stressed and overall happier than people in their 20s. They’re also more emotionally mature, so they can shrug off stressors that 20-somethings would consider a death sentence. In other words, while they spend more time by themselves, they’re much better suited to doing so, and may even enjoy it. Being alone isn’t the same as being lonely.

Provided we work to become someone who’ll be good company in our golden years.

Each language you speak in alters your perception of time, study finds

Language can have a powerful effect on how we think about time, a new study found. The link is so powerful that switching language context in a conversation or during a task actually shifts how we perceive and estimate time.

Image credits Willi Heidelbach.

I think we all like to consider our minds as being somehow insulated from the goings-on around us, and to take comfort in knowing they will absorb, process, and respond to external stimuli in a calm, efficient, but most of all consistent fashion. Maybe it comes down to the sense of immutable truth our reasoning is imbued with if we assume that it’s rooted in a precise and impartial system — in a chaotic world, we need to know that we can trust our mind. A somewhat conceited view, I’d say, since it’s basically our mind telling us what to believe about itself.

And it’s also probably false. Professors Panos Athanasopoulos, a linguist at Lancaster University, and Emanuel Bylund, a linguist at Stellenbosch University and Stockholm University, have discovered that our perception of time strongly depends on the linguistic context we’re currently using.

Doublespeak

People who are fluent in two (bilinguals) or more (polyglots) languages are known to ‘code-switch’ often — a rapid and usually unconscious shift between languages in a single context. But each language carries within it a certain way of looking at the world, of organizing and referring to things around us. For example, English speakers mark duration of events by likening them to physical distances, e.g. a short lecture, while a Spanish speaker will liken duration to volume or amount, e.g. a small lecture. So each language subtly ingrains a certain frame of reference for time on its speaker.

But bilinguals, the team found, show a great degree of flexibility in the way they denote duration, based on the language context in use. In essence, this allows them to change how the mind perceives time.

For the study, the team asked Spanish-Swedish bilinguals to estimate the passage of time or of distance (the latter serving as a distractor task) while watching a screen showing either a line growing across it or a container being filled. Participants reproduced a duration by clicking the computer mouse once, waiting for what they judged to be the appropriate amount of time, and clicking again. They were prompted to do this task with either the word ‘duración’ (the Spanish word for duration) or ‘tid’ (the Swedish term). The containers and lines themselves weren’t an accurate representation of duration, however; they were meant to test to what extent participants could disregard spatial information when estimating duration.
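
As a rough sketch of how such a reproduction task yields a score (my own simplified reconstruction in Python, not the authors’ experimental code), the measure of interest is simply how far the interval a participant reproduces between two clicks drifts from the duration that was actually presented:

```python
import time

def reproduced_interval_ms(wait_for_click):
    """Time the interval a participant reproduces between two mouse clicks.
    `wait_for_click` is a placeholder callable that blocks until a click
    arrives; any GUI toolkit's event loop could supply it."""
    wait_for_click()                       # first click starts the reproduced interval
    start = time.perf_counter()
    wait_for_click()                       # second click ends it
    return (time.perf_counter() - start) * 1000.0

def discrepancy_ms(presented_ms, reproduced_ms):
    """The interference scores reported below boil down to how far the
    reproduced interval drifts from the presented one, in milliseconds."""
    return abs(reproduced_ms - presented_ms)
```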

The idea is that if language does interfere with our perception of duration, Spanish speakers (who talk about time as a volume) would be influenced more by the fill level of the containers than their Swedish counterparts (who talk about time as a distance), and vice-versa for the lines.

And it did

Image credits emijrp / Wikimedia.

The team recruited 40 native Spanish speakers and 40 native Swedish speakers, all bilingual, and had them run three variations of the test. The first one found that native Spanish speakers were influenced to a greater extent (there was a wider discrepancy between real and perceived time) by the containers than by the lines (scoring an average 463-millisecond discrepancy versus the Swedes’ 344 ms). Native Swedish speakers were more influenced by the lines than by the containers (scoring a 412 ms discrepancy versus their counterparts’ 390 ms).

The second test again included 40 participants from each group. In the absence of the Spanish/Swedish prompt words, the team “found no interaction between language and stimulus type, in either the line condition or the container condition. […] both language groups seemed to display slightly greater spatial interference in the lines condition than in the containers condition. There were no significant main effects.”

The third test included seventy-four Spanish-Swedish bilinguals who performed either the line or the container task. The team removed the distractor task to reduce fatigue and alternated between the prompt languages. Each participant took the experiment twice, once with Spanish and once with Swedish prompt labels. The team concludes that “when all stimuli were analysed,” there were “no significant main effects or interaction” in either the distance or the time task — meaning both groups were just as accurate in estimating time or distance regardless of language.

“Our approach to manipulate different language prompts in the same population of bilinguals revealed context-induced adaptive behavior,” the team writes. “Prompts in Language A induced Language A-congruent spatial interference. When the prompt switched to Language B, interference became Language B-congruent instead.”

“To our knowledge, this study provides the first psychophysical demonstration of shifting duration representations within the same individual as a function of language context.”

Exactly why this shift takes place is still a matter of debate: the team interprets the finding in the context of both the label-feedback hypothesis and the predictive processing hypothesis, but mostly in technical terms for other linguists to discern. For you and me, I think the main takeaway is that as much as our minds shape words so do words shape our minds — texturing everything from our thoughts to our emotions, all the way to our perception of time.

The paper “The Whorfian Time Warp: Representing Duration Through the Language Hourglass” has been published in the Journal of Experimental Psychology.

Editor’s note: edited measured discrepancy for more clarity.

Time travel is proven possible — but we’ll likely never be able to build the machine, author says

New research from the University of British Columbia, Okanagan comes to validate the nerdiest of your dreams. Time travel is possible according to a new mathematical model developed at the university — but not likely anytime soon. Or ever.

Image credits Clark Mills.

The idea of a modern time-traveling machine has its roots in H.G. Wells’ The Time Machine, published way back in 1895. Needless to say, it has enraptured imaginations all the way up to the present, and scientists have been trying to prove or disprove its feasibility ever since. One century ago, Einstein unveiled his theory of general relativity, cementing time as a fourth dimension and describing gravitational fields as the product of distortions in spacetime. Confidence in Einstein’s theory only grew following the detection of gravitational waves generated by colliding black holes by the LIGO Scientific Collaboration.

So time isn’t just an abstract, human construct — it’s a dimension just as real as the physical space we perceive around us. Does that mean we can travel through time? Ben Tippett, a mathematics and physics instructor at UBC’s Okanagan campus, says yes. An expert on Einstein’s theory of general relativity, sci-fi enthusiast and black hole researcher in his spare time, Tippett recently published a paper which describes a valid mathematical model for time travel.

“People think of time travel as something as fiction,” says Tippett. “And we tend to think it’s not possible because we don’t actually do it. But, mathematically, it is possible.”

Tippett says that dividing space into three dimensions, with time as a fourth, separate dimension, is incorrect. These four facets should be imagined simultaneously, he adds, connected as a space-time continuum. Starting from Einstein’s theory, Tippett notes that the curvature of space-time explains the curved orbits of planets around stars. In ‘flat’ (or uncurved) space-time, a planet or a star would keep moving in a straight line. But in the vicinity of a massive stellar body, space-time curves, bending the trajectories of nearby planets around that body.

Tippett proposes using such curvature to create a time machine. The closer one gets to a black hole, he says, the slower time moves. So if we could find a way to recreate that effect and bend time into a circle for the passengers of the time machine, we could go backwards or forwards in time.

Tippett created a mathematical model of a Traversable Acausal Retrograde Domain in Space-time (TARDIS). He describes it as a bubble of space-time geometry which carries its contents backward and forwards through space and time as it tours a large circular path. The bubble moves through space-time at speeds greater than the speed of light at times, allowing it to move backward in time.

But although it’s possible to describe the device using maths, Tippett doubts we’ll ever build such a machine.

“HG Wells popularized the term ‘time machine’ and he left people with the thought that an explorer would need a ‘machine or special box’ to actually accomplish time travel,” Tippett says.

“While it is mathematically feasible, it is not yet possible to build a space-time machine because we need materials–which we call exotic matter–to bend space-time in these impossible ways, but they have yet to be discovered.”

The paper “Traversable acausal retrograde domains in spacetime” has been published in the journal Classical and Quantum Gravity.

The Earth is spinning slower, making the days longer and longer

Days are getting longer as the Earth’s rotation suffers tiny alterations over time. But you’re probably not going to notice anything anytime soon — a day gets one extra minute every 6.7 million years, a new study estimates.

Image via Pexels.

Each “day” is the amount of time it takes for the planet to do a full rotation around its own axis. So any shift in the speed of Earth’s rotation will have an inverse and proportional effect on the length of a day — higher speeds shorten the day, slower rotation means longer days.

And the latter case seems to be true. A British team estimates that the average day has gained 1.8 milliseconds each century over the past 2,700 years. The rate they calculated is “significantly less” than previous estimates, which settled on 2.3 ms per century — which would translate to one minute every 5.2 million years. Still, retired Royal Greenwich Observatory astronomer and lead co-author Leslie Morrison admits that it remains “a very slow process.”

“These estimates are approximate, because the geophysical forces operating on the Earth’s rotation will not necessarily be constant over such a long period of time,” he added.

“Intervening Ice Ages etcetera will disrupt these simple extrapolations.”

The previous figure of 2.3 ms was estimated from calculations of the Moon’s effect on “Earth-braking” — its gravitational force pulls on the Earth’s water and land, effectively pulling against the force of rotation.

But Morrison and his team also factored in gravitational theories about the Earth’s movements around the Sun as well as the Moon-Earth interactions, to calculate the timing of solar eclipses over time as seen from our planet. They then calculated where on Earth they’d be visible from, and compared the results to records of eclipses from ancient Babylonians, Chinese, Greeks, Arabs and medieval Europeans.

“We obtained historical, relevant records from historians and translators of ancient texts,” explained Morrison for the AFP.

“For example, the Babylonian tablets, which are written in cuneiform script, are stored at the British Museum and have been decoded by experts there and elsewhere.”

They found discrepancies between the points the eclipses should have been observable from and where they were actually seen. This discrepancy can only be caused by a rotational speed different from the one the team used in their model (the present one).

“This discrepancy is a measure of how the Earth’s rotation has been varying since 720 BC” when ancient civilisations started keeping eclipse records, they wrote.
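
To get a feel for why millisecond-per-century changes show up in ancient eclipse records at all, here is a back-of-the-envelope Python sketch of my own, assuming a perfectly steady 1.8 ms/century lengthening (the real analysis is far more careful). The tiny daily excess accumulates over roughly a million days into hours of clock offset, and since the Earth turns about 15 degrees per hour, an offset of several hours shifts where on the planet an eclipse is seen by thousands of kilometres.

```python
# Back-of-the-envelope: clock offset accumulated since 720 BC if the day
# lengthens steadily by 1.8 ms per century (a simplifying assumption).
rate_ms_per_century = 1.8
years = 2700
days = years * 365.25
centuries = years / 100

# The excess day length grows linearly from 0 to rate * centuries, so the
# average excess per day over the whole span is half of today's excess.
final_excess_ms = rate_ms_per_century * centuries    # ~48.6 ms per day today
average_excess_ms = final_excess_ms / 2              # ~24.3 ms per day

offset_hours = days * average_excess_ms / 1000 / 3600
print(f"~{offset_hours:.1f} hours of accumulated offset")      # roughly 6-7 hours
print(f"~{offset_hours * 15:.0f} degrees of longitude shift")  # Earth turns ~15 deg/hour
```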

Earth’s rotation speed can be influenced by the Moon’s braking effect, by electromagnetic interactions inside the planet (between the solid core and the mantle that floats over it), as well as by mass shifts on the planet — changes in sea level, polar caps shrinking since the last Ice Age, large reservoirs, and so on.

And while most of us probably won’t ever notice this increase, it’s vital that scientists know about it. This information can be used, for example, in adjusting the high-precision clocks which underpin our navigation systems.

The full paper “Measurement of the Earth’s rotation: 720 BC to AD 2015” has been published in the journal Proceedings of the Royal Society A.

Consciousness comes in “slices” roughly 400 milliseconds long

A new model proposed by EPFL scientists tries to explain how our brain processes information and then makes us consciously aware of it. According to their findings, consciousness forms as a series of short bursts of up to 400 milliseconds, with gaps of background, unconscious information processing in between.

Image via pixabay user johnhain

Subjectively, consciousness seems to be an uninterrupted stream of thought and sensation, giving us a smooth image of the world around us. As far as we can tell, sensory information is continuously recorded and fed into our perception; we then process it and become aware of it as this happens. We can clearly see the movement of objects, we hear sounds from start to end without pause, and so on.

But have you ever found yourself reacting to something before actually becoming aware of the need to react? Let’s say you’re running and you trip, but you adjust your movements to prevent a fall almost automatically. Or you’re in traffic, the car in front of you suddenly stops, and you slam on the brakes instinctively, even before you register the danger. If so, you’ve most likely said “thanks, reflexes” and left it at that.

This, however,  hints at processes that analyze data and elaborate responses without our conscious input, sparking a debate in the science community that goes back several centuries. Why does this automated response form — just as an extra safety measure? Or rather, because your consciousness isn’t always available when push comes to shove? In other words, is consciousness constant and uninterrupted, or more akin to a movie reel — a series of still shots?

Michael Herzog at EPFL and Frank Scharnowski at the University of Zurich now put forward a new model of how the brain processes unconscious information, suggesting that consciousness arises only in intervals up to 400 milliseconds, with no consciousness in between. By reviewing data from previously published psychological and behavioral experiments on the nature of consciousness — such as showing a participant several images in rapid succession and asking them to distinguish between them while monitoring their brain activity — they have developed a new conceptual framework of how it functions.

They propose a two-stage processing of information. During the first, unconscious stage, our brain processes specific features of objects, such as color or shape, and analyzes them with a very high time-resolution. Crucially for the proposed model, there is no actual perception of time during this phase — even time-dependent features such as duration or changes in color are not perceived as such. Time simply becomes a value assigned to each state, just like color or shape. In essence, during this stage your brain gathers and processes data, then puts it into a spreadsheet (a brain-Excel, if you will), and “time” becomes just another value in a column.

Then comes the conscious stage: after unconscious processing is completed the brain renders all the features into our conscious thought. This produces the final picture, making us aware of the stimulus. Processing a stimulus to conscious perception can take up to 400 milliseconds, a considerable delay from a physiological point of view. The team focused their study on visual perception alone, and the delay might vary between the senses.

“The reason is that the brain wants to give you the best, clearest information it can, and this demands a substantial amount of time,” explains Michael Herzog. “There is no advantage in making you aware of its unconscious processing, because that would be immensely confusing.”

This is the first time a two-stage model has been proposed for how consciousness arises, and it may offer a more refined picture than the purely continuous or discrete models. It also provides useful insight into the way our brain processes time and relates it to our perception of the world.

The full paper, titled “Time Slices: What Is the Duration of a Percept?” has been published online in the journal PLOS Biology and can be read here.

All of 2015’s weather, in a stunning 4K time-lapse video.

The European Meteorological Satellite Organization (EUMETSAT) in collaboration with the Japan Meteorological Agency (JMA) and the National Oceanic and Atmospheric Administration (NOAA) just released a time-lapse 4K video of the weather of 2015 — and it’s awesome.

Image from youtube video.

We had our share of wild weather last year — the drought in California, Hurricane Patricia, and the staggering dust storms in the Middle East among the more extreme examples — most of it fueled by shifting climate patterns. But for all the destruction it can unleash, weather is usually a gentle mistress. Thanks to EUMETSAT, you can now relive the terrifying events, as well as the pleasant days that don’t make it to the headlines, in an amazing high-definition video from space.

Using geostationary imagery compiled from EUMETSAT, JMA, and NOAA satellites, the video brings together 365 days of data in one stunning reel of 2015’s weather.

The northern hemisphere set a record for the most major tropical cyclones to form in a year, so keep an eye on the tropics throughout the video. Around the 6:30 mark, Hurricane Joaquin, the strongest Atlantic hurricane of the year, starts to form. This category 4 storm battered the Bahamas and the East Coast before raging all the way across the Atlantic into the U.K. At the 6:55 mark, Hurricane Patricia hits Mexico’s west coast before heading inland into Texas.

Beyond these destructive events, the wider-reaching weather systems of our planet can be observed. During the Amazonian rainy season, which lasts from December through April, clouds pop up over the region almost daily, but as the dry season sets in they become far less common. Weather patterns roll over continents and oceans: a storm in the Southeastern U.S. today becomes next week's rain in Spain.

EUMETSAT has improved image quality since last year's edition, upgrading to 4K resolution for incredible detail. This was made possible by the better satellites that both Japan and EUMETSAT launched into orbit last year. With NOAA planning to launch its own brand-new high-resolution geostationary satellite this year, we can look forward to even sharper images in future videos.

Well, that and improved forecasts — but I’m always sold on eye candy.

Is modern life really busier? Not really, Oxford lab finds out

Armed with almost 1 million diary entries, an Oxford-based laboratory is trying to figure out why modern life seems so hectic. Is it really, or is it a matter of perspective?

Image credits Ales Krivec.

In the early 1960s, televisions were making their big breakthrough. In the UK and the US, everybody and their grandma was buying a TV, so the BBC wanted to know the best times to air its programmes. It decided to conduct a nation-wide study, asking almost 2,500 people to write down in a diary what they were doing every half hour and to indicate whether the TV or radio was on.

The result was a collection of entries like "8 a.m., Eating breakfast"; "8.30 a.m., Taking children to school"; "9 a.m., Cleaning away, washing up and listening to Housewives' Choice", a popular radio record-request programme of the day. So why is this significant now?

Well, the Centre for Time Use Research at the University of Oxford, UK, gathered these results and many others like them to see what people really do with their time, and whether or not modern life is as hectic as we generally consider it. The centre has collected similar diaries from 30 countries, spanning more than 50 years and covering some 850,000 person-days in total.

“It certainly is unique,” says Ignace Glorieux, a sociologist at the Dutch-speaking Free University of Brussels. “It started quite modest, and now it’s a huge archive.”

While analyzing the results is still a work in progress, some results are already apparent: first of all, we’re not busier now than we used to be.

“We do not get indicators at all that people are more frantic,” says John Robinson, a sociologist who works with time-use diaries at the University of Maryland, College Park. In fact, when paid and unpaid work are totted up, the average number of hours worked every week has not changed much since the 1980s in most countries of the developed world.

Source: J. Gershuny / Centre for Time Use Research, Oxford.

So if we aren't actually busier, then why do we feel like we are? Well, for starters, it could be us that's to blame. People tend to grossly overestimate how much they work: in the United States, by some 5–10% on average. Those who work very long hours tend to overestimate even more; people who guess that they work 75-hour weeks, for example, can be over by more than 50%. Similar patterns have been reported in Europe.
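To put that last figure in perspective, here is a quick back-of-the-envelope calculation (the 75-hour figure comes from the studies above; the arithmetic itself is just illustrative):

```python
# Back-of-the-envelope: what a reported 75-hour work week looks like
# if the report overstates actual hours by 50%.
reported_hours = 75        # hours per week, as self-reported
overestimate = 0.50        # the report is 50% higher than reality

actual_hours = reported_hours / (1 + overestimate)
print(f"Reported: {reported_hours} h/week, actual: {actual_hours:.0f} h/week")
# -> Reported: 75 h/week, actual: 50 h/week
```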

But some demographics are indeed working longer than they did a few decades ago. Single parents stand out the most, putting in extraordinarily long hours compared to the majority. The other category is well-educated professionals, where societal and work pressures in particular push people to put in more hours than they used to.

“The combination of those pressures has meant that there is this group for which time pressure is particularly pertinent,” says Oriel Sullivan, a sociologist who now co-directs the centre with sociologist Jonathan Gershuny.

This collection is also a gold mine for health researchers. Not only will it allow us to better understand how our time management has changed over the years, but it will also provide insight into health conditions exacerbated by the modern lifestyle. The diaries "were the greatest asset I could possibly have", says physiologist Edward Archer at the University of Alabama at Birmingham, who used the data in a 2013 study of obesity. He found that modern, sedentary activities (most notably sitting in front of a computer) were among the main factors contributing to widespread obesity in the developed world. His team estimated that working women today burn some 130 kilocalories per day less than those in the 1960s.

Now, the team is trying to increase their collection, adding entries from countries such as China, India and South Korea. Then, they can examine cultural differences and see what trends are universal, and estimate how things will develop in the future. There is an abundance of data, and it will definitely take a lot of time to analyze it all.

 


Bacterial growth limited by time, not only concentration, revising Alan Turing's 1950s theory

How do organs such as the heart or kidneys know when to stop growing? A number of theories have been proposed to answer this, the most entrenched of which dates back to 1952, when the famed Alan Turing used math to show how biological cell patterns form and how cells know when to stop dividing. Turing envisioned that cells knew when to stop growing based on their concentration in a certain location. Researchers at Duke University designed a gene circuit to coax bacteria into growing in a predictable ring pattern. Their findings suggest that the bacteria sense their environment and that the engineered gene circuit functions as a timing mechanism. Running counter to established theory, the findings may have profound implications for biotechnology.

Ring patterns form in a micro-colony of engineered bacteria. Credit: Stephen Payne, Pratt School of Engineering, Duke.

The team of researchers was led by Lingchong You, associate professor of biomedical engineering.

“Everywhere you look in nature there are patterns, many of them very beautiful and even inspirational,” said You. “Our work adds another dimension to the general principles of pattern formation.”

Alan Turing was deeply fascinated by pattern formation in nature and by fractals. The scientist was a particularly gifted mathematician with a keen eye for patterns. It's worth noting that Turing, among other scientific contributions of invaluable worth (the Turing machine), was the man who broke the Nazi Enigma code, shortening WWII. Unfortunately, Turing was disgraced by his home country because of his sexual orientation, a fact that led to his regrettable suicide.

Nevertheless, Turing's legacy with biological patterns lives on. In the 1950s, he imagined that biological patterns form through the interactions of certain chemicals he called "morphogens", which initiate and direct patterns by acting as on- or off-switches. Using math, Turing showed that morphogens moving through space could produce patterns that mimic those seen in animal skins and leaf shapes. His model became the de facto leading theory of biological pattern formation.
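For the mathematically inclined, Turing's scheme is usually written as a pair of reaction-diffusion equations. The generic textbook form below is not the specific system studied at Duke, but it captures the idea: an activator u and an inhibitor v react with one another (the terms f and g) while diffusing through space at different rates (D_u and D_v):

$$\frac{\partial u}{\partial t} = f(u, v) + D_u \nabla^2 u, \qquad \frac{\partial v}{\partial t} = g(u, v) + D_v \nabla^2 v$$

Spots, stripes and rings emerge when the inhibitor diffuses much faster than the activator.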

A new dimension to bio-pattern growth: time

Using molecular biology lab techniques, however, You and colleagues were unable to replicate a biological pattern predicted by Turing's model. The Duke researchers engineered a version of the favorite lab bacterium, E. coli, to produce two molecules: one serving as the "on" switch that promotes colony growth and replication, and the other acting as an "off" switch that halts growth once the concentration of "on" molecules climbs high enough.

To better analyze colony growth and pattern formation, the researchers also engineered the bacteria to produce fluorescent colours. The ensuing patterns didn’t behave as the scientists initially predicted, though. Instead, the colonies were much smaller than the research team expected based on how fast the “on” signal should diffuse.

To solve the mystery, the scientists added a high concentration of the “on” signal to the growth chamber, flooding the bacteria with the signal. The bacteria formed the same distinctive ring pattern over the same time, which showed they weren’t responding to changes in the concentration of the “on” signal in space.

The only viable explanation, it seemed, was that the "on" molecules acted as a timing cue. To test this idea, the researchers built a mathematical model of the timing mechanism, which predicted how the cells would respond to changes in the size of their growth chamber. Experiments later confirmed the predictions.

“By serving as a timing cue, the morphogen ‘on’ signal enables the system to sense and respond to the size of the environment,” said You. “The larger the area, the longer it takes for the morphogen to accumulate to a high enough concentration to trigger pattern formation. As such, a larger area will lead to a larger ring pattern.”
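A toy calculation (my own illustration, not the Duke team's actual model) shows why a timing cue naturally encodes the size of the environment:

```python
# Toy illustration: if the colony secretes the "on" morphogen at a fixed
# total rate, the time needed to reach a trigger concentration grows with
# the volume of the growth chamber.
production_rate = 1.0     # morphogen produced per hour (arbitrary units)
threshold = 5.0           # concentration that triggers pattern formation

for volume in (1.0, 2.0, 4.0):           # relative chamber sizes
    time_to_trigger = threshold * volume / production_rate
    print(f"{volume:.0f}x chamber -> pattern triggered after {time_to_trigger:.0f} h")
# A larger chamber takes longer to fill, so the colony keeps growing longer
# and the final ring ends up larger.
```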

Next, the researchers plan to use other gene circuits to create more intricate patterns. Using this technique, and armed with newfound knowledge of how biological patterns form, scientists could make finely tuned scaffolds for the production of new materials, such as thin metal films for energy applications.

The findings were presented in a paper published in the journal Molecular Systems Biology.  


Nuclear clocks set to become the most accurate timekeepers on Earth, losing only a fraction of a second over 14 billion years

Nuclear clocks will keep track of time at an unprecedented level of accuracy. The white rabbit from Alice in Wonderland would have most likely been interested in this research.

Atomic clocks are the current most accurate time and frequency standards, capable of operating with an uncertainty of only a second over millions of years. New research in the works by scientists from the University of New South Wales seeks to track time with an unprecedented accuracy of a mere twentieth of a second over 14 billion years, 100 times better than an atomic clock.

Atomic clocks work by tracking the orbit of electrons, essentially using them as a sort of pendulum. The researchers suggest they can reach a hundredfold increase in accuracy with an alternative approach: using lasers to orient the electrons in an atom in such a way that the clock can track neutrons orbiting the atom's nucleus instead. The proposed single-ion clock, or nuclear clock, would thus be accurate to 19 decimal places, or a twentieth of a second over 14 billion years, roughly the age of the Universe.
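As a quick sanity check on those numbers (a rough estimate, taking 14 billion years as the age of the Universe):

```python
# Rough check: a twentieth of a second lost over ~14 billion years
# corresponds to a fractional error of about 1e-19, i.e. 19 decimal places.
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # ~3.16e7 s
age_of_universe_s = 14e9 * SECONDS_PER_YEAR    # ~4.4e17 s
error_s = 1 / 20                               # a twentieth of a second

print(f"Fractional accuracy: {error_s / age_of_universe_s:.1e}")  # ~1.1e-19
```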

Electrons are subject to slight external interference, which causes a small yet important inaccuracy in atomic clocks. Neutrons are bound tightly within the nucleus, which makes them almost immune to such interference. Atomic clocks are currently the world's timekeeping standard and are widely used in a range of applications, from GPS navigation systems to high-bandwidth data transfer, governmental timing synchronization, and system synchronization in particle accelerators, where even a nanosecond of error is too much.

“This is nearly 100 times more accurate than the best atomic clocks we have now,” says professor Victor Flambaum of the University of New South Wales.

“It would allow scientists to test fundamental physical theories at unprecedented levels of precision and provide an unmatched tool for applied physics research.”

It's not clear just yet if or when the researchers plan to construct such a clock, but their findings are set to be published in the journal Physical Review Letters.

Japan earthquake causes Earth's axis to tilt – shortens day!

As a consequence of last Friday's devastating earthquake near the coast of Japan, the nation's most powerful earthquake since record-keeping began in the late 1800s, scientists have assessed that the Earth's axis has shifted by a few inches and that the day has been shortened by about a millionth of a second.

Both phenomena are linked to the recent earthquake and tsunami by the same explanation: the shift of tectonic plates beneath the Pacific Ocean opened up a crack about 250 miles long, causing a good portion of Earth's crust to tumble inside.

“There was a redistribution of an enormous amount of the Earth’s crust,” theoretical physicist Michio Kaku, with the City College of New York, told Discovery News. “It actually shortened the time of the day, and also shifted the axis of the Earth.”

“We know how much the Earth contracted as consequence of that, then you do the math,” Kaku said.
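Kaku's "do the math" boils down to conservation of angular momentum: Earth's spin angular momentum stays fixed, so if the redistribution of crust lowers the planet's moment of inertia by some tiny fraction, the rotation speeds up and the day shortens by that same fraction. A minimal sketch of the arithmetic (the microsecond figure comes from the estimates above; the rest is standard physics):

```python
# Conservation of angular momentum: L = I * omega is constant, so a small
# fractional drop in the moment of inertia I shortens the day by the same fraction.
DAY_S = 86400.0        # seconds in a day
shortening = 1e-6      # ~a millionth of a second, per the estimates above

print(f"Fractional change in day length (and in I): {shortening / DAY_S:.1e}")
# ~1.2e-11, i.e. the quake rearranged Earth's mass by about one part in 100 billion.
```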

We all live busy lives, but I don't think anyone is going to miss the millionth of a second shaved off the days to come. What might prove to have a slight effect is the shift in Earth's axis, which some scientists link to ice ages.

“There is still a scientific debate as to what causes ice ages,” Kaku said. “The leading theory is that there are tiny perturbations in the axis of the Earth as it turns around the sun that accumulate with time. These small shifts, this wobbling of the axis of the Earth may in fact cause ice ages.”

“Every century, we have several of these monster earthquakes so it’s hard to estimate exactly how much of an impact this earthquake would have on an ice age, for example,” Kaku added.

[image via clipartheaven. story via discovery]

Obtaining two-dimensional time – fact or fiction?


For almost a century, scientists have been trying to figure out what's good and what's bad about Einstein's vision of the universe. Blending the traditional view of physics with the bizarre, almost impossible to understand world of quantum physics is no easy task for anybody. It's due to our current understanding of these phenomena that we believe it is possible to communicate instantly and to be in two places at once. The effort to unify the two views has produced a stream of elaborate theories and ideas, the most notable being string theory and its successor, M-theory. In fact, M-theory unifies the five superstring theories and a type of supergravity. Recently, however, a new variable has entered the equation. Itzhak Bars, a theoretical physicist at the University of Southern California, thinks these hypotheses are missing a crucial ingredient: an extra dimension of time.

By adding a second dimension of time and a fourth dimension of space to the vision Einstein created, Bars came up with a new model that provides information that was previously impossible to obtain. He claims this model could provide the answer to how nature works; a lofty goal, I would add. The thing is, physicists have avoided adding a second dimension of time because it opens the possibility of traveling back in time and (perhaps even trickier) introduces negative probabilities and other things that seem to be nonsense. So could there actually be a second dimension of time?

According to Bars, the answer is "Yes, but only indirectly". That means we should think of the world as we know it as a set of shadows, which look different when seen from different perspectives.

“The predicted relations among the different shadows contain most of the information about the extra dimensions,” he explains.

The next step is for him and his team to develop tests to find out whether their theory is right, and that is exactly what they are doing, while also adding gravity into the mix. If everything goes according to plan, this could mean adding two dimensions of time to M-theory and finding the "answers" that have eluded us so far.