Thursday, April 30, 2015

Oculus Rift-Based System Brings True Immersion To Telepresence Robots


DORA Telepresence Robot Gives You Fully Immersive Experience

Remote presence robots, as the name implies, act as your stand-in at a distant location, letting you move around and see and hear through a robotic surrogate. Space agencies, researchers, and the military have developed high-end telepresence systems that offer an immersive experience, but these units can cost millions of dollars. Consumer telepresence robots (like the Double or Beam), on the other hand, cost much less but can’t create a sense of immersion—you’re staring at a video feed on a computer screen, after all.

Now a team of roboticists at the University of Pennsylvania is using affordable sensors and actuators and virtual reality technologies like the Oculus Rift to build a platform that offers the same capabilities as a high-end telepresence system at a reasonable cost. DORA (Dexterous Observational Roving Automaton) attempts to bring true immersion to teleoperated robots by precisely tracking the motion of your head in all six degrees of freedom and then duplicating those motions on a real robot moving around the real world. The goal is to make the experience so immersive that, while operating the robot at a remote place, you’ll forget that you’re not actually there.

When you put on a virtual reality system like the Oculus Rift, you’re looking at a virtual world that’s being rendered inside of a computer just for you. As the sensors on the system track the motions of your head in near-real time, software updates the images displayed in front of your eyes at a frame rate higher than you can detect. When this works, it works pretty well, and (in my experience) it’s not at all difficult to convince yourself that you’re temporarily inside a simulation or game.

However, if all the pieces of the system aren’t working together flawlessly, it’s very easy to tell that something is off. Best case, it breaks the immersion. Worst case, it breaks the immersion and you get sick. The problem here is that our eyes and brains and inner ears and all of the other sensors that we use to perceive the world have incredibly high standards for realism, and they’re very good at letting us know when something isn’t quite right.

This is the reason that immersive telepresence is so difficult: it has to be close to perfect. It’s not particularly difficult to create a telepresence robot on a pan/tilt head that can stream video and mimic your basic movements, but that level of sophistication isn’t going to fool your brain into an immersive experience. For immersion, you need something a lot more complicated, and that’s what DORA is trying to accomplish.
“DORA is based upon a fundamentally visceral human experience—that of experiencing a place primarily through the stimulation of sight and sound, and that of social interaction with other human beings, particularly with regards to the underlying meaning and subtlety associated with that interaction. At its core, the DORA platform explores the question of what it means to be present in a space, and how one’s presence affects the people and space around him or her.”
Er, yeah. What they said.

Let’s take a look at how DORA works. The Oculus headset tracks both orientation (via an IMU) and position (using infrared beacon tracking). The pose data are sent wirelessly to Arduino and Intel Edison microcontrollers on the robot, which reproduces the motion in all six degrees of freedom: pitch, roll, and yaw, plus translation along the x, y, and z axes.
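As a rough illustration of that head-to-robot mapping, here is a minimal sketch of turning tracked head angles into commands for a pan/tilt/roll camera head. The travel limit, the 0–180 degree hobby-servo command range, and the function names are all assumptions for illustration, not details of DORA’s actual firmware.

```python
import math

def head_to_servo(yaw, pitch, roll, limit=math.radians(90)):
    """Map tracked head angles (radians) onto a pan/tilt/roll camera head.

    Each axis is clamped to the mechanism's assumed travel (+/- `limit`)
    and rescaled to the 0-180 degree command range that hobby servos
    expect, with 90 meaning centred.
    """
    def to_servo(angle):
        clamped = max(-limit, min(limit, angle))
        return round(math.degrees(clamped) + 90)

    return {"pan": to_servo(yaw), "tilt": to_servo(pitch), "roll": to_servo(roll)}

# Looking 45 degrees to the side pans the robot's head 45 degrees as well.
print(head_to_servo(math.radians(45), 0.0, 0.0))  # → {'pan': 135, 'tilt': 90, 'roll': 90}
```

In a real system the same idea would run inside a control loop at the headset’s tracking rate, with the commands streamed over the radio link.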
DORA’s cameras each stream back 976 x 582 video at 30 frames per second, which is a bit below what the Oculus can handle in both resolution and frame rate. This is mainly a budgetary constraint at this point (DORA is still a prototype, remember), but if the researchers upgrade their cameras, the overall system would have no trouble handling it.

With an immersive system like this, one of the biggest concerns is latency: when you move your head, you expect what you see with your eyes to change. If the delay between these two things is too large, it can result in an experience that can be anywhere between uncomfortable and full-on puketastic.

With a VR system like the Oculus, sensors have to register that you’ve moved your head and send that information to your computer, your computer has to calculate how much your view has changed and how and then render the next frame of animation, and then the screen on the headset has to display that imagery. Oculus cites early VR research to suggest that 60 milliseconds is “an upper limit for acceptable VR… [but] most people agree that if latency is below 20 ms, then the lag is no longer perceptible.”

DORA’s main challenge is that it has to contend not only with a wireless connection but also with the constraints imposed by moving mechanical parts. Its creators say that in typical use they’ve measured a latency of about 70 milliseconds. This is still slightly high, the team admits, but it’s not bad, and we’d expect that there’s additional optimization that could be done.
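It helps to think of motion-to-photon latency as a budget summed across every stage of the loop. The stage names and millisecond values below are purely illustrative assumptions, chosen so the total lands near the ~70 ms the team reports; they are not the DORA team’s actual breakdown.

```python
# A purely illustrative motion-to-photon budget for a telepresence loop.
# Stage values are assumptions chosen to total roughly the ~70 ms the
# DORA team reports; they are NOT measured figures from the project.
STAGES_MS = {
    "head_tracking":    2,   # IMU/beacon fusion on the headset
    "radio_uplink":     5,   # pose command reaches the robot
    "servo_actuation": 20,   # mechanical head catches up with yours
    "camera_capture":  17,   # about half a frame period at 30 fps
    "video_encode":    10,
    "radio_downlink":   5,   # video frame travels back
    "display_scanout": 11,   # one frame period at 90 Hz
}

def motion_to_photon(stages):
    """Total latency is simply the sum of every stage in the loop."""
    return sum(stages.values())

total = motion_to_photon(STAGES_MS)
verdict = ("imperceptible" if total < 20
           else "acceptable" if total <= 60
           else "noticeable")
print(f"{total} ms ({verdict})")  # → 70 ms (noticeable)
```

Framed this way, the thresholds Oculus cites (20 ms imperceptible, 60 ms the upper limit) show why a robot is harder than a renderer: servo travel and camera frame time alone can eat most of the budget.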

At the moment, DORA is operating over a radio link with a line-of-sight range of about 7 kilometers. This is what DORA was designed for, but if it ends up working in museums or as a remote platform for first responders, it’ll obviously have to transition to Wi-Fi or a 4G connection. This will of course introduce additional latency, but the DORA team expects that as wireless infrastructure improves over the coming years, it won’t take long to reach a point where DORA could be accessed from nearly anywhere.

DORA’s creators, a team of UPenn seniors (John C. Nappo, Daleroy Sibanda, Emre Tanırgan, and Peter A. Zachares) led by robotics professor Vijay Kumar, say they’ve tested the system with 50 volunteers, and only three reported some sort of motion sickness, which seems about on par with a typical Oculus experience.
We haven’t tried out DORA for ourselves (yet), but Emre describes the experience of telepresence immersion like this:
You feel like you are transported somewhere else in the real world as opposed to a simulated environment. You actually get to see people interacting with you, as if you were actually there, and it’s difficult to recreate the same experience in a virtual environment right now. You could argue that real time computer graphics used for VR will soon catch up to the level of realism of our world, but another key difference is the fact that everything is live. There is an inherent unpredictability with a system like this where you could go somewhere in the real world and not know what could possibly happen when you’re there.
We’ve been told that there’s potential for a crowdfunding campaign and a formal product launch, although the team doesn’t yet have any concrete plans. Initial markets would likely include relatively controlled, semi-structured environments like museums, followed by applications for specialists like emergency responders. Eventually, DORA would become available for consumers to play with, and that, of course, is what we’re selfishly most interested in.

Telepresence companies like Suitable Technologies are already testing out remote presence experiences, like museum visits for people with disabilities. With DORA, you could have a similar experience except while feeling as if you were really there, instead of just looking at a moving picture inside of a box. It may not be able to replace the real world (not yet), but it’s getting closer, and a system like DORA is the way to do it.
[ DORA ]
Special thanks to John, Peter, Daleroy, and Emre for speaking with us.

Wednesday, April 29, 2015

VR at the Pong level

Chet Faliszek on virtual reality gaming: We're at the Pong level, we've only scratched the surface

Already hugely impressive, virtual reality is only now at the stage video games were when Pong was released in 1972, says Left 4 Dead, Portal and Half-Life writer Chet Faliszek.
Speaking at the Slush Play virtual reality (VR) conference in Reykjavik, the Valve video game writer gave advice on what he expects to see from the technology in 2015 - and said that the honest answer is, no one really knows.

"None of us know what the hell we are doing. We're still just scratching the surface of VR. We still haven't found out what VR is, and that's fine. We've been making movies in pretty much the same way for 100 years, TV for 60 years and videogames for 40. VR has only really been [in development] for about a year, so we're at Pong level."

Now in its first year, Slush Play is a spin-off of the popular Slush conference, which takes place annually in Helsinki to celebrate Nordic technology startups. Instead of being open to all kinds of tech startups, Slush Play focuses on young companies in the video game and VR industries.

Faliszek likened forcing VR into the style of video games we already play to fixing a rudder to an early car, because that's how steering had always worked. He says entire game genres could be changed and rewritten to fit with what VR is capable of, and cited attempts to bring VR to Grand Theft Auto 5 as examples of going about VR development in the wrong way.

Locomotion is a real problem

"Just because a game genre has been around for 35 years doesn't mean it'll work with VR. How do you move around in VR? Locomotion is a real problem. Or you might find out that that genre shouldn't exist anymore. It doesn't work."

The video game writer added that VR, which provides a 360-degree world completely surrounding the player, "fundamentally changes the way you interact and experience the world. VR actually changes the game and experience - embrace that and experiment with that.

"We can get into gamers' heads in ways we never have before. The feeling of vulnerability has never been higher. You aren't looking at the action, you're in it and you can't escape it."

But there's one thing that VR game developers must be careful to avoid, and that is motion sickness and nausea. "There's one thing you can't do and that's make people sick," Faliszek said. "It has to run at 90 frames per second. Any lower and people feel sick."

Putting a nose on the screen isn't the answer

Rebutting a recent suggestion that adding a virtual nose to what the player sees stops them feeling sick by giving them a fixed and familiar point of reference, Faliszek said: "Putting a nose on the screen isn't the answer; when you do it right, nobody gets sick.
"Telling people they will be ok 'Once you get your VR legs' is a wholly wrong idea. If people need to get used to it then that's failure."

The excitement surrounding VR at Slush Play is positively palpable. Even after eight hours of talks and fireside chats, the audience of developers and investors were keen to tell each other just how fundamentally the technology will change the gaming landscape. One attendee was overheard saying VR was "everything I dreamed of as a kid." Another told a colleague: "There have been more updates in VR in the past two years than in the 15 before that."

Monday, April 20, 2015

Cancer blood test


Cancer diagnosis has seen many developments, and a new technique created by a team of researchers stands out for being simple and easy to perform.
They have developed a technique called the liquid biopsy: a blood test that can diagnose cancer.

The principle is simple: the test detects tiny snippets of cancer DNA circulating in the patient’s blood.

Because only a blood sample is taken, the approach is much safer than traditional diagnostic techniques such as a tissue biopsy or a CT scan.

The technique could give oncologists a quick estimate of whether a treatment is working. It offers a simple way to monitor cancer therapy and to catch developments such as the cancer cells becoming resistant to the treatment.

Researchers said, “This could change forever the way we follow up not only response to treatments but also the emergence of resistance.”

The researchers caution that extensive evaluation is still needed to confirm the efficacy of the blood test, but small studies in colon, lung and blood cancers have produced encouraging early results.

In one study of 126 patients with lymphoma, the researchers were able to predict recurrence even faster than with a traditional CT scan.

The liquid biopsies could also help researchers identify the patients who are more likely to respond to therapy.

Researchers said, “Every cancer has a mutation that can be followed with this method, it is like bar coding the cancer in the blood.”

Standard methods for judging the effectiveness of cancer treatment are comparatively vague, since they rely on improvements in the patient’s symptoms, and those improvements vary with every individual, making them unreliable.

Sometimes doctors conclude that a tumor is still present when it is in fact gone.
Experts say, “When you are treating a patient — and we see this many times — your treatment is quite effective but there is some residual lesion on a scan, you take the patient to surgery for a biopsy, and all you see is scar tissue. There is no visible cancer there.”

A reliable blood test, by contrast, lets doctors clearly identify whether the treatment is working for a given patient.

Mandelbrot zoom


StartsWithABang writes You're used to real numbers: that is, numbers that can be expressed as a decimal, even if it's an arbitrarily long, non-repeating decimal. There are also complex numbers, which are numbers that have a real part and also an imaginary part. The imaginary part is just like the real part, but is also multiplied by i, or the square root of -1. It's a simple definition: the Mandelbrot set consists of every possible complex number, n, where the sequence n, n^2 + n, (n^2 + n)^2 + n, etc.—where each new term is the prior term, squared, plus n—remains bounded rather than running off to infinity in magnitude. The scale of zoom visualizations now goes well past the limits of the observable Universe, with no signs of loss of complexity at all.
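That membership rule translates directly into a few lines of code. This sketch uses the conventional formulation z → z² + c starting from z = 0, which generates the same sequence as above with c playing the role of n; the escape radius of 2 is standard (once |z| exceeds 2, divergence is guaranteed), while the iteration cap is an arbitrary choice.

```python
def in_mandelbrot(c, max_iter=100, escape_radius=2.0):
    """Test whether the complex number c appears to lie in the Mandelbrot set.

    Iterate z -> z*z + c from z = 0 and report whether |z| stays bounded.
    Once |z| exceeds 2 the sequence is guaranteed to diverge, so 2 is the
    standard escape radius; max_iter bounds how long we keep checking.
    """
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > escape_radius:
            return False  # escaped: definitely not in the set
    return True  # still bounded after max_iter steps: treat as a member

# 0 and -1 cycle forever, while 1 blows up (1, 2, 5, 26, ...).
print(in_mandelbrot(0), in_mandelbrot(-1), in_mandelbrot(1))  # → True True False
```

Zoom renderers run this test (or a smooth-colored variant of it) once per pixel over an ever-smaller window of the complex plane, which is why the detail never runs out.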

Thursday, April 16, 2015

Immune system to kill all cancers

Scientists find key to 'turbo-charging' immune system to kill all cancers

A protein which ‘turbo-charges’ the immune system so that it can fight off any cancer or virus has been discovered by scientists.
In a breakthrough described as a ‘game-changer’ for cancer treatment, researchers at Imperial College found a previously unknown molecule which boosts the body’s ability to fight off chronic illnesses.
Scientists at Imperial College London, who led the study, are now developing a gene therapy based on the protein and hope to begin human trials in three years.
“This is exciting because we have found a completely different way to use the immune system to fight cancer,” said Professor Philip Ashton-Rickardt, from the Section of Immunobiology in the Department of Medicine at Imperial, who led the study.
“It could be a game-changer for treating a number of different cancers and viruses.
“This is a completely unknown protein. Nobody had ever seen it before or was even aware that it existed. It looks and acts like no other protein.”

The protein – named lymphocyte expansion molecule, or LEM – promotes the spread of cancer-killing ‘T cells’ by generating large amounts of energy.

Normally when the immune system detects cancer it goes into overdrive trying to fight the disease, flooding the body with T cells. But it quickly runs out of steam.

However, the new protein delivers a massive energy boost that produces T cells in such great numbers that the cancer cannot fight them off.

It also boosts immune memory cells, which are able to recognise tumours and viruses they have encountered previously, so there is less chance those diseases will return.

The team made the discovery while screening mice with genetic mutations. They found one type produced ten times the number of cancer-fighting T cells, suppressing infections and becoming resistant to cancer.

Researchers found that the mice with enhanced immunity produced high levels of the unknown protein which is also found in humans.

They are hoping to produce a gene therapy whereby T cells of cancer patients could be enhanced with the protein and then injected back into the body. It could end the need for harsh chemotherapies as the body itself would be fighting the disease, rather than toxic drugs.

Dr Mike Turner, Head of Infection and Immunobiology at The Wellcome Trust, said: “The discovery of a protein that could boost the immune response to not only cancer, but also to viruses, is a fascinating one.

“Further investigation in animal models is needed before human trials can commence, but there is potential for a new type of treatment that capitalises on the immune system’s innate ability to detect and kill abnormal cells.”

Charities said the protein showed 'great promise' and were eager to see if it could be translated into humans.

Dr Alan Worsley, senior science information officer at Cancer Research UK, said: “This exciting work in mice is still at an early stage and only looked at one type of cancer.

“Cancer often finds a way to suppress the immune system, but drugs that overcome this and allow immune cells to target cancer show great promise. Research into the biology of the immune system could help develop more effective treatments by increasing the number of cancer-killing immune cells.

“The researchers now need to figure out how to develop drugs that target this molecule, and whether doing so would be safe and effective in cancer patients.”
The research was published in the journal Science.

Monday, April 13, 2015

Spain hologram protest

Late last year the Spanish government passed a law that set extreme fines for protesters convening outside of government buildings.

In response to the controversial Citizen Safety Law, which will take effect on July 1, Spanish activists have staged the world's first ever virtual political demonstration.

After months of massive flesh-and-blood protests against the so-called 'gag law', thousands of holograms last night marched in front of the Spanish parliament in Madrid.
Organised by the group Holograms for Freedom, ghost-like figures holding placards took aim at the imminent draconian measures, arguing that holographic people are now afforded greater freedoms than their real-life counterparts.

The 'NoSomosDelito' (meaning: 'We are not crime') movement - composed of more than 100 different organisations - called upon sympathisers around the world to participate in the landmark event simply by webcamming their face via the campaign website.
More than 2,000 virtual images were sent and used in the hour-long hologram demonstration, El Pais reported.

The following video comes from Euronews.

Under the Citizens Safety Law, it is illegal to gather in front of government buildings without permission from authorities; this includes everything from universities to hospitals.
Organisers of unauthorised demonstrations could be fined up to €600,000, with further €600 fines for disrespecting police officers, and €30,000 for filming or photographing them.

In a video promoting the protest, a spokeswoman said: "With the passing of the Gag Law, you won't be allowed to assemble in public spaces without risking a fine.
"Ultimately, if you are a person, you won't be allowed to express yourself freely. You will only be able to do it if you are a hologram."

Spokesman Carlos Escano told Spanish newspaper El Mundo: "Our protest with holograms is ironic.
"With the restrictions we're suffering on our freedoms of association and peaceful assembly, the last option left to us in the end will be to protest through our holograms."

Cure for color blindness

For the more than 10 million Americans with colorblindness, there’s never been a treatment, let alone a cure, for the condition that leaves them unable to distinguish certain hues.
Now, for the first time, two University of Washington professors have teamed with a California biotech firm to develop what they say may be a solution: a single shot in the eye that reveals the world in full color.

Jay and Maureen Neitz, husband-and-wife scientists who have studied the vision disorder for years, have arranged an exclusive license agreement between UW and Avalanche Biotechnologies of Menlo Park. Together, they’ve found a new way to deliver genes that can replace missing color-producing proteins in certain cells, called cones, in the eyes.

“I don’t think there’s any question that it will work,” said Maureen Neitz, 57, a UW professor of ophthalmology.

The new treatment — which may be tested in humans within two years — could be a boon for the 1 in 12 men and 1 in 230 women with color-vision deficiency.

The trouble occurs when people are born without one or more of the three types of color-sensing proteins normally present in the cones of the retina. The most common type is red-green colorblindness, followed by blue-yellow colorblindness. A very small proportion of the population is completely colorblind, seeing only shades of gray.

Because they can’t perceive certain colors, they see hues in muted or different shades than people with normal vision.

Brian Chandler, 38, of Seattle, said he first noticed he was colorblind in seventh grade, when he started getting C’s and D’s on drawings in science class.
“I was coloring green stuff brown and brown stuff green,” recalled Chandler, a traffic-safety engineer.

Colorblindness is often a genetic disorder. It affects mostly men, who can inherit a mutation on the X chromosome that impairs their perception of red and green. A much smaller fraction of cases are in women, who have two X chromosomes, which gives them a better chance of avoiding effects of any genetic defect.

Most people think of colorblindness as an inconvenience or mild disability, mainly causing problems with unmatched shirts and socks. But the Neitzes say the condition can have profound impacts — limiting choices for education or careers, making driving dangerous, and forcing continual adaptation to a world geared for color vision.

“There are an awful lot of people who feel like their life is ruined because they don’t see color,” said Jay Neitz, 61, the professor of ophthalmology who confirmed in 1989 that dogs are colorblind, too.
People may not qualify as commercial pilots, for instance, if they’re colorblind. Other careers that can be limited include those of chefs, decorators, electricians and house painters, all of which require detailed color vision.

The Neitzes have focused on the disorder for years, first proving in 2009 they could use gene therapy to correct colorblindness in male squirrel monkeys, which are born unable to distinguish between red and green.

In the journal Nature, they reported the success of a technique that inserted the human form of a gene that detects red color into a viral shell, and then injected it behind the retinas of two squirrel monkeys.
The monkeys, named Sam and Dalton — the latter after the British chemist John Dalton, who was the first to analyze and report on his own color-vision deficiency — had been trained to recognize colors on a computer screen in exchange for a reward of grape juice. Before the surgery, they couldn’t detect certain hues, while after the procedure they got them right nearly every time.

But that technique is risky, requiring surgery, so the Neitzes were looking for another way to do the job.

“For 10 years, we have been trying to figure out a way to get the genes to go to the back of the eye with a simple shot,” said Neitz.

Now, with the help of Avalanche, the researchers say they’ve developed a technique that does just that. It uses a safe vector, called an adeno-associated virus, to house the pigment gene, which is injected directly into the vitreous, the jellylike center of the eye. Once there, it targets cells on the back of the retina, said Thomas W. Chalberg Jr., the co-founder and chief executive of the firm.
“It’s a protein shell, kind of like a Trojan horse, that gets you entry into the cell. Once you’re there, the DNA gets to set up shop and produce the photo pigment of interest,” he said. Avalanche has two drug candidates, AVA-322 and AVA-323, that carry pigment-producing genes.

It takes only 30 percent of the cells to be transduced, or changed, to put the world in a whole new hue, Jay Neitz said. Early tests show the technique meets that mark in monkeys.

After preclinical trials are complete, Chalberg said he hopes to move to human trials within one to two years and then seek federal Food and Drug Administration approval for the treatment. Eventually, the treatment could be offered during a single visit to an ophthalmologist’s office.
Such a development would be “an amazing advantage,” said Dr. Rohit Varma, a professor of ophthalmology and director of the Eye Institute at the University of Southern California, who is not involved in the research.

“It would cure or at least help people who are colorblind,” he said. “This is the first hope, in many ways, for these individuals that suffer from this.”

While noting that many tests which succeeded in animals have later failed in humans, he said he’s cautiously optimistic the trials will deliver as promised. Plus, he said, it will be important to learn whether the therapy not only adds the color-sensing ability, but actually improves the lives of those who are treated.

That’s a thought echoed by Dr. Paul Sternberg Jr., chairman of the Vanderbilt Eye Institute at Vanderbilt University in Nashville and a clinical spokesman for the American Academy of Ophthalmology. He called the Neitzes “world-class scientists” and said the wider field awaits potential human trials of the technique.

“The brain develops a certain way of seeing,” he said. “We don’t know whether replacing the visual pigment in a 25-year-old colorblind man will allow him to see in full color.”
Already there appears to be high interest in finding out. Since March 25, more than 10,000 people have visited a new website associated with the project, including many who hope to be the first cured of the condition.

“I definitely would be interested,” said David Curry, 33, of Port Townsend, who had to abandon a dream of becoming a commercial helicopter pilot because of his deuteranopia, or red-green colorblindness. “I’d want to speak to Professor Neitz more and learn about the process more before putting my eyes on the chopping block, so to speak. But I’ve always wondered what it would look like to see color like everyone else.”

Brian Chandler, also colorblind, isn’t so sure. He said he’s learned to adapt to the world using cues other than color to get along.

“I have mixed feelings about ‘curing’ my color-vision deficiency,” said Chandler. “On one hand, I’m nervous about the change, since I’ve seen like this my entire life. On the other, it would potentially be exciting to see things I had not seen before.”

For their part, the Neitzes say they’re eager to see a lifetime of work put into clinical practice. The technique to correct colorblindness also might eventually be used for other cone-based disorders, including retinitis pigmentosa, an inherited disorder that can lead to blindness.
Curing colorblindness, though, could affect millions who would like to know what they’re missing, the scientists said.

“There’s nobody with a black-and-white TV who, if you said, ‘Would you like color TV?’ wouldn’t trade it,” Jay Neitz said.

Sunday, April 12, 2015

South Carolina shooting


Walter L. Scott (February 9, 1965 – April 9, 2015),[3] a 50-year-old black man, served two years in the U.S. Coast Guard before being given a general discharge for a drug-related incident.[4] He was a forklift operator, studying massage therapy, and the father of four children.[5][4][6] Weeks before the shooting, he became engaged to marry his long-time girlfriend.[7]
After the shooting, examination of Scott's police record indicated ten arrests, mostly for contempt of court regarding failure to pay child support or to appear for court hearings. He was also arrested in 1987 on an assault and battery charge, and convicted in 1991 of possession of a bludgeon.[1]


Michael Thomas Slager, a 33-year-old white police officer and a native of New Jersey, had served in the North Charleston Police Department (NCPD) for five years and five months prior to the shooting. Before becoming a police officer, he too served in the U.S. Coast Guard.[8] At the time of the shooting, Slager's wife was eight months pregnant with their first child.[8]
Slager was named in a police complaint in 2013 after he allegedly "tased a man for no reason". Slager was cleared in that incident, although the victim and several witnesses said they were never interviewed. North Charleston police said they would now review that case.[9] In another complaint in January, he was cited for failing to file a report after an African-American woman called police because her children were being harassed. Personnel documents describe Slager as having demonstrated "great officer safety tactics" in dealing with suspects, and noted his proficiency with a Taser.[8]


Walter Scott owed more than $18,000 in child-support payments and had a bench warrant for his arrest when he was fatally shot by a South Carolina police officer, according to court documents obtained by NBC News.
Scott's parents have suggested they believe their son fled from the officer, Michael Slager, because he owed back child-support payments and did not want to be arrested again. 

Scott owed a total of $18,104 in back child-support, the documents obtained by NBC News show. His last payment was on July 20, 2012, according to the paperwork. The bench warrant for his arrest had been active since a January 16, 2013, court hearing. At that time, Scott had owed $7,836 — but the amount had increased to more than $18,000 at the time of his death. 

The information in the documents appeared to contradict an Associated Press report early Friday. Citing court records, the AP had reported that no bench warrant had been issued for Scott. It also reported that he owed nearly $7,500 in child-support payments.
In an interview with TODAY, Scott's parents explained why they believe their son bolted from Slager before the deadly shooting. 

"I believe he didn't want to go to jail again," Walter Scott Sr. told TODAY. "He just ran away." 


The incident begins in the audio at about the 7:30 mark of the 30-minute recording as Slager calls in a traffic stop. He was pulling over Scott’s Mercedes-Benz for a broken light, police said.

Slager can be heard at about the 10:35 mark of the recording calling dispatch to announce he’s in a foot chase, describing the suspect as black in a green shirt and blue pants. The dispatcher then repeats his description and calls for radio silence other than transmissions related to the chase.

At about 11:05 of the recording, another officer says he’s en route to join the chase. Slager then tells the other officers his new location and can be overheard telling someone to “get down on the ground.”

At 12:27 (9:38 a.m.), as the other officers try to find Slager, he says, “shots fired. Subject is down. He grabbed my Taser.”

Slager then says a minute later that he needs his vehicle secured. He says the suspect has gunshot wounds to the chest, thigh and buttocks and is unresponsive. He tells the dispatcher the scene, behind a pawn shop in a field, is secure.

Another officer, Clarence Habersham, then arrives and confirms the injuries to the victim. He says over the radio that they’ve started to provide first aid, including chest compressions. An EMS unit arrives at the scene at about the 19:43 mark of the tape, about six minutes after Slager called in the shooting.

In statements made before the video was released, Slager claimed he feared for his life after Scott took his Taser from him.

Police said Scott’s Mercedes-Benz sedan was stopped because it had a broken brake light, according to the newspaper. Scott ran away from Slager, who chased him. During the chase, Scott confronted Slager, his attorney said in a statement. Slager took out his Taser, but he said Scott took the device during the struggle, overpowering the officer and making him fear for his life.

Slager said he then fired at Scott because he “felt threatened,” Slager’s attorney said Monday in a statement.