Could Wearable Technology Make Football Safer?

The NFL’s 2017-18 season came with its fair share of injuries, including the more common ones, like torn ACLs, rotator cuff issues, and complications from reparative surgeries. However, one incident stands out from the rest: the head and spinal injury suffered by Steelers linebacker Ryan Shazier during a game against the team’s long-time rivals, the Cincinnati Bengals. The moment Shazier collapsed to the ground clutching his back is one that will haunt all who witnessed it, both in the stands and at home.

Thankfully, the Steelers linebacker is well on the road to recovery, as he is finally home from the hospital and able to stand without assistance. Furthermore, he is determined to support his team in the upcoming season, one way or another.

One benefit of such incidents is that they inspire researchers in the medical and tech fields to seek new methods of tracking athletes and protecting them from debilitating, career-ending, and even life-threatening injuries.

So far, wearable technology seems to be the most fitting way to achieve this goal. At one time, such devices were used simply to track athletes’ steps, sleep patterns, and overall performance, helping coaches determine the best methods of improving practice sessions and strategies. Now, however, these sensors can be used to measure athletes’ stamina, as well as the severity of the many collisions they encounter during any given game.

In order to discreetly collect this data, researchers tested sensor-filled mouth guards equipped with gyroscopes and accelerometers. The gyroscopes tracked where a given player’s head was in space, offering insight into how players were holding their heads and how those positions correlated with their stamina (e.g., players who held their heads down, exposing the crowns of their heads and their spines to injury, were often tired). The accelerometers, on the other hand, measured a player’s speed and movement around the field.

These tools were effective in showing researchers not only where players were hit, but also how hard they were hit. If a given player took a particularly intense blow, the sensors would notify the coaches via Bluetooth, signaling that the player ought to be taken off the field and examined.
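To make that idea concrete, here is a minimal Python sketch of the kind of threshold check such a system might run on each recorded hit. The data fields, the 80 g cutoff, and the sideline alert are illustrative assumptions on my part, not details from the study described above.

```python
# Illustrative sketch only: the threshold and data fields are assumptions,
# not details from the mouth guard study described in the article.
from dataclasses import dataclass

IMPACT_THRESHOLD_G = 80.0  # hypothetical peak linear acceleration cutoff, in g


@dataclass
class ImpactReading:
    player_id: str
    timestamp_s: float          # seconds since kickoff
    peak_acceleration_g: float  # peak linear acceleration from the accelerometer
    head_pitch_deg: float       # head tilt from the gyroscope (negative = head down)


def needs_evaluation(reading: ImpactReading) -> bool:
    """Return True if the hit is severe enough that coaches should be alerted."""
    return reading.peak_acceleration_g >= IMPACT_THRESHOLD_G


def alert_coaches(reading: ImpactReading) -> None:
    # A real system would push this message over Bluetooth to a sideline device;
    # here we simply print it.
    print(f"ALERT: {reading.player_id} took a {reading.peak_acceleration_g:.0f} g hit "
          f"at t={reading.timestamp_s:.0f}s -- remove from play for examination.")


# Example: a single hard hit recorded mid-game
hit = ImpactReading("LB-50", timestamp_s=1843.2, peak_acceleration_g=92.5, head_pitch_deg=-35.0)
if needs_evaluation(hit):
    alert_coaches(hit)
```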

Although it is difficult to determine just when such devices would be employed on the field, it is encouraging to see that serious injuries are finally being viewed and treated as such, and that players’ health and wellbeing are finally being prioritized by the medical community. Hopefully, these developments will soon encourage coaches and strategists to follow suit, rather than pushing injured players back on the field for the sake of winning a game.

How AI Bots Are Becoming More Human

In the middle of Google’s headquarters, something unbelievable, almost like science fiction, is occurring. Robots are studying human beings. They are memorizing the ways in which humans interact, perform mundane tasks, eat, drink, sleep, and entertain themselves with household gadgets and other tech objects; in short, everything we do on a daily basis.

However, this is not part of an elaborate scheme to take jobs away from human employees, nor a plot to integrate artificial intelligence bots into our society. On the contrary, Google researchers are merely training these bots to better identify and respond to human actions and interactions.

In the past, Google researchers have made extraordinary breakthroughs in the development and improvement of artificial intelligence. However, every version of every bot has struggled with the same tasks: identifying and classifying objects and humans, and recognizing the ways in which humans interact with members of those categories.

With that fact in mind, Google has implemented an intriguing method of educating their bots to more effectively and efficiently identify human interaction — a tactic that entails hours of binge-watching movies.

Over the past several years, researchers have curated a catalog of over 57,000 movie clips — which feature over 96,000 human beings in total — from around the globe. Throughout any given day, these artificial intelligence bots go through the catalog and practice identifying and categorizing different actions and interactions.

Thanks to this method, the bots are getting a clearer picture of human interaction than ever before. Human actions and interactions are less clear and harder to identify in real time than they are in film productions, which is why the bots have struggled so greatly to categorize them while observing live subjects.
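As a rough illustration of what this kind of training might look like, here is a minimal Python sketch using PyTorch. The label set, the clip dataset, and the tiny 3D-convolution classifier are stand-ins of my own invention; the article does not describe Google’s actual pipeline, and this is not it.

```python
# Hypothetical sketch of training an action classifier on short movie clips.
# The labels, data, and model below are placeholders, not Google's pipeline.
import torch
from torch import nn
from torch.utils.data import DataLoader, Dataset

ACTIONS = ["eating", "drinking", "talking", "watching_tv"]  # illustrative label set


class MovieClipDataset(Dataset):
    """Yields (clip, label) pairs; each clip is a short video tensor (C, T, H, W)."""

    def __init__(self, num_clips: int = 64):
        self.num_clips = num_clips

    def __len__(self):
        return self.num_clips

    def __getitem__(self, idx):
        clip = torch.randn(3, 16, 112, 112)  # stand-in for 16 decoded RGB frames
        label = torch.randint(len(ACTIONS), (1,)).item()
        return clip, label


class TinyActionClassifier(nn.Module):
    """A deliberately small 3D-convolution classifier, just to show the shape of the task."""

    def __init__(self, num_actions: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 8, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.head = nn.Linear(8, num_actions)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))


model = TinyActionClassifier(len(ACTIONS))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One pass over the (placeholder) clip catalog
for clips, labels in DataLoader(MovieClipDataset(), batch_size=8):
    optimizer.zero_grad()
    loss = loss_fn(model(clips), labels)
    loss.backward()
    optimizer.step()
```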

Now, if this experiment gives Google’s researchers the results they desire, it may not only improve the bots’ ability to recognize human actions and interactions, but also help them eventually grasp the purpose behind those actions, as well as the goals humans are trying to reach through them. This could one day aid Google’s bots in targeting potential consumers based on their actions online and in person.

It is important to note, however, that such achievements are likely a long way off, as artificial intelligence is still very much in its infancy. Still, it will be intriguing to see what other tasks artificial intelligence bots may be able to perform once they are deeply familiar with not only our actions, but our intentions as well.

Facial Recognition in Airports

Remember the first time you posted a photo on Facebook, and the algorithm figured out who you were trying to tag before you could even type in their name? Blew your mind, right? Well, facial-recognition technology has only gotten more sophisticated, and I recently read an article about how it’s on the path to being used in airports. The TSA has already begun testing facial recognition systems at Dulles and JFK airports. Face-reading check-in kiosks that compare the faces captured at security screenings will be appearing at both Ottawa International and London Heathrow later this year. A new project called Biometric Exit is now set to bring this system to every international airport in the US.

Biometric Exit would use facial matching systems to identify every visa holder as they leave the country. Here’s how it would work: passengers would have their photos taken immediately before boarding, and those photos would be matched against the passport-style photos provided with the visa application, verifying each traveler’s identity and recording their departure so that officials can tell who has overstayed. While it’s currently being tested on a single flight, the Trump administration plans to expand its use to every flight and border crossing in the US. Speaking at the Border Security Expo last week, US Customs and Border Protection’s Larry Panetta spoke of the importance of facial recognition.
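For a sense of what that one-to-one match might look like under the hood, here is a hypothetical Python sketch. The face embeddings are assumed to come from some face-recognition model applied to each photo, and the 0.6 similarity threshold is an illustrative value; none of this reflects a system CBP has actually published.

```python
# Hypothetical sketch of the one-to-one face match at the boarding gate.
# The embedding vectors and the similarity threshold are illustrative assumptions.
import numpy as np

MATCH_THRESHOLD = 0.6  # assumed cosine-similarity cutoff


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def is_same_traveler(boarding_embedding: np.ndarray,
                     visa_photo_embedding: np.ndarray) -> bool:
    """Compare the photo taken at boarding against the visa-application photo."""
    return cosine_similarity(boarding_embedding, visa_photo_embedding) >= MATCH_THRESHOLD


# Example with stand-in embeddings (a real system would produce these by running
# a face-recognition model on each photo):
rng = np.random.default_rng(0)
visa_photo = rng.normal(size=128)
boarding_photo = visa_photo + rng.normal(scale=0.1, size=128)  # same person, new photo

if is_same_traveler(boarding_photo, visa_photo):
    print("Match: record this visa holder's exit.")
else:
    print("No match: flag for manual review.")
```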

Biometric Exit, or at least some form of it, has been discussed for decades. However, it’s only recently that facial recognition has been named as the method of choice. Fingerprint- and iris-based systems have been considered as well, but facial recognition has ultimately been preferred, since it’s the easiest to deploy. Although Customs and Border Protection agents take photos and fingerprints from every visa holder entering the country, no such measures exist to verify whether somebody has left the country before their visa expires. According to Homeland Security, roughly half a million people overstay their visas to the US each year, although it can’t determine who these people are without a verifiable exit process. This is where Biometric Exit would come in. Trump has made the program a large part of his aggressive border security policy.

For this system to work, it will require a robust method for checking passengers’ faces against outside datasets, and as that system is shared with more agencies, it might be used for a lot more. Such technology could be extended to land borders and even shared with private airlines. There are still some technical challenges, and it’s as yet unclear how well the system works with existing in-airport surveillance systems, but sharing the backend with CBP could make the system a whole lot more efficient. However, such systems raise a lot of difficult civil rights questions, especially if the FBI is integrated into this system.

While Customs and Border Protection has said that Biometric Exit is meant to benefit travelers without disrupting travel, concerns have been raised about racial bias. American facial recognition systems are typically trained on mostly white subjects, so they’re a lot less accurate when scanning people of other races. If such a bias isn’t corrected, it could become a major civil rights issue, especially since visa holders tend to be younger and less white than the average US population. As Trump expedites the program’s growth, such questions become more and more urgent.
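One way to make the bias concern concrete is a simple audit that compares false non-match rates (genuine travelers wrongly rejected) across demographic groups. The Python sketch below is purely illustrative; the group labels and outcomes are made up, and no real matcher is being evaluated.

```python
# Illustrative bias audit: compute the false non-match rate (genuine pairs
# wrongly rejected) per demographic group. All records here are made up.
from collections import defaultdict

# Each record: (group, truly_same_person, matcher_said_match)
results = [
    ("group_a", True, True), ("group_a", True, True), ("group_a", True, False),
    ("group_b", True, True), ("group_b", True, False), ("group_b", True, False),
]

genuine_pairs = defaultdict(int)
false_non_matches = defaultdict(int)
for group, same_person, matched in results:
    if same_person:
        genuine_pairs[group] += 1
        if not matched:
            false_non_matches[group] += 1

for group in sorted(genuine_pairs):
    rate = false_non_matches[group] / genuine_pairs[group]
    print(f"{group}: false non-match rate = {rate:.0%}")
```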

Improving Office Communication

Businesses focus a lot on how to communicate with their customers, yet that doesn’t mean anything if they can’t tackle internal communication. If there isn’t any healthy dialogue between the people in your organization, then that needs to change. Communication fosters morale, independence, collaboration, execution and diversity, altogether forming a stronger company. I recently read an article that shared five tips for improving communication. Here’s what it had to say:

Small talk: Small talk is simple enough, but it goes a long way by helping build trust.  It might seem cheap and pointless, yet it makes people feel more comfortable with one another, which opens the door for more meaningful conversations in the future.  Next time you see somebody you don’t know well, talk to them, whether it’s about how their day is going, a game they watched or maybe their plans for the weekend.  

Team building exercises: If there isn’t much communication in your company, then maybe trying a direct approach could help.  One great way to do that is through team building exercises where coworkers need to collaborate.  These are especially useful for building relationships and improving communication in new teams.  

Clear communication channels: Sometimes people are bad at communicating with each other simply because they don’t know how to.  Every individual in an organization should know who they’re reporting to, who reports to them and how that looks in the larger company structure.  When this is clarified, then sharing ideas and collaborating becomes a lot easier.  

Feedback loops: Traditional annual performance reviews are outdated; today’s workers need instant feedback, not annual reports. A regular “feedback loop”, simply a process that defines how actions are evaluated and assessed, is essential for getting constructive criticism and encouragement consistently. Most of this feedback happens naturally in daily conversation.

An open-door policy: While most people think of an “open-door policy” in the literal sense, it goes far beyond a boss keeping their door open throughout the day.  It means that anybody in the organization can talk to anybody whenever they need to.  It’s the idea that the new intern can ask the CEO a question.