WAIMS

Your Gateway to Knowledge and Innovation


ROLE OF ICT IN ATTAINING SUSTAINABLE DEVELOPMENT: GLOBAL FORUM

The ICT sector has proven to be a strong driver of GDP growth in nations across the globe. From developing countries to developed nations, the ICT sector has contributed to the success of each country’s economy, to the development of its people’s skills and potential, and to positioning the country as a place where international organizations can do business more efficiently.

The expansion of information and communications technology and worldwide connectivity has enormous potential to accelerate human progress, to bridge the digital divide, and to develop knowledge societies. ICT can strengthen the means of implementing long-term development goals by nurturing global collaboration and coordination, encouraging technology transfer and capacity building, strengthening multi-stakeholder partnerships, and facilitating data monitoring and accountability. ICT thus plays a vital role in attaining sustainable development, particularly in emerging information and knowledge societies.

ICT offers open access to academic research, transparency for informed decision-making, and platforms for online collaboration in co-creation, learning, and work. By increasing access to information and knowledge, ICT narrows differences within and among nations. This enables social and economic progress, even for disadvantaged parts of society such as persons with disabilities. There is no doubt that access to advanced technologies has grown very fast, and yet these remarkable gains are still undermined by persistent gaps in ICT access between and within countries, between urban and rural settings, and between men and women, boys and girls. A major digital divide remains in place, with more people offline than online and particularly poor access in vulnerable developing nations. The challenge now is to bring the rest of the world online and to ensure that no one is left behind.

This Is How Statue Of Unity Looks From Space. See First Image

An American company has captured a stunning image of the world’s tallest statue from space. The image of the 597-foot Statue of Unity, honouring India’s “Iron Man” Sardar Vallabhbhai Patel, was shared by the satellite-imaging company Planet Labs, whose constellation of Earth-observation satellites photographed the monument.

The image shows the statue from above, with the Narmada River flowing nearby.

Incidentally, when the Indian Space Research Organisation set a world record in 2017 by launching 104 satellites together, 88 of them were Earth-imaging Dove satellites from the same American company aboard the PSLV.

Sardar Vallabhbhai Patel’s statue was unveiled by Prime Minister Narendra Modi at Kevadiya in Gujarat on October 31. At 177 feet taller than China’s Spring Temple Buddha, previously the world’s tallest statue, it is twice as tall as the Statue of Liberty in New York and was built at an estimated cost of Rs. 2,989 crore.

Life on Mars?

It’s hard enough to identify fossilized microbes on Earth. How would we ever recognize them on Mars?


A Martian meteorite fueled speculation and debate in 1996 when scientists reported that it held signs of past life. The search now moves to Mars itself. (NASA)

On August 7, 1996, reporters, photographers and television camera operators surged into NASA headquarters in Washington, D.C. The crowd focused not on the row of seated scientists in NASA’s auditorium but on a small, clear plastic box on the table in front of them. Inside the box was a velvet pillow, and nestled on it like a crown jewel was a rock—from Mars. The scientists announced that they’d found signs of life inside the meteorite. NASA administrator Daniel Goldin gleefully said it was an “unbelievable” day. He was more accurate than he knew.

The rock, the researchers explained, had formed 4.5 billion years ago on Mars, where it remained until 16 million years ago, when it was launched into space, probably by the impact of an asteroid. The rock wandered the inner solar system until 13,000 years ago, when it fell to Antarctica. It sat on the ice near Allan Hills until 1984, when snowmobiling geologists scooped it up.

Scientists headed by David McKay of the Johnson Space Center in Houston found that the rock, called ALH84001, had a peculiar chemical makeup. It contained a combination of minerals and carbon compounds that on Earth are created by microbes. It also had crystals of magnetic iron oxide, called magnetite, which some bacteria produce. Moreover, McKay presented to the crowd an electron microscope view of the rock showing chains of globules that bore a striking resemblance to chains that some bacteria form on Earth. “We believe that these are indeed microfossils from Mars,” McKay said, adding that the evidence wasn’t “absolute proof” of past Martian life but rather “pointers in that direction.”

Among the last to speak that day was J. William Schopf, a University of California at Los Angeles paleobiologist, who specializes in early Earth fossils. “I’ll show you the oldest evidence of life on this planet,” Schopf said to the audience, and displayed a slide of a 3.465 billion-year-old fossilized chain of microscopic globules that he had found in Australia. “These are demonstrably fossils,” Schopf said, implying that NASA’s Martian pictures were not. He closed by quoting the astronomer Carl Sagan: “Extraordinary claims require extraordinary evidence.”

Despite Schopf’s note of skepticism, the NASA announcement was trumpeted worldwide. “Mars lived, rock shows: Meteorite holds evidence of life on another world,” said the New York Times. “Fossil from the red planet may prove that we are not alone,” declared The Independent of London.

Over the past nine years, scientists have taken Sagan’s words very much to heart. They’ve scrutinized the Martian meteorite (which is now on view at the Smithsonian’s National Museum of Natural History), and today few believe that it harbored Martian microbes.

The controversy has prompted scientists to ask how they can know whether some blob, crystal or chemical oddity is a sign of life—even on Earth. A debate has flared up over some of the oldest evidence for life on Earth, including the fossils that Schopf proudly displayed in 1996. Major questions are at stake in this debate, including how life first evolved on Earth. Some scientists propose that for the first few hundred million years that life existed, it bore little resemblance to life as we know it today.

NASA researchers are taking lessons from the debate about life on Earth to Mars. If all goes as planned, a new generation of rovers will arrive on Mars within the next decade. These missions will incorporate cutting-edge biotechnology designed to detect individual molecules made by Martian organisms, either living or long dead.

The search for life on Mars has become more urgent thanks in part to probes by the two rovers now roaming Mars’ surface and another spaceship that is orbiting the planet. In recent months, they’ve made a series of astonishing discoveries that, once again, tempt scientists to believe that Mars harbors life—or did so in the past. At a February conference in the Netherlands, an audience of Mars experts was surveyed about Martian life. Some 75 percent of the scientists said they thought life once existed there, and of them, 25 percent think that Mars harbors life today.

The search for the fossil remains of primitive single-celled organisms like bacteria took off in 1953, when Stanley Tyler, an economic geologist at the University of Wisconsin, puzzled over some 2.1-billion-year-old rocks he’d gathered in Ontario, Canada. His glassy black rocks known as cherts were loaded with strange, microscopic filaments and hollow balls. Working with Harvard paleobotanist Elso Barghoorn, Tyler proposed that the shapes were actually fossils, left behind by ancient life-forms such as algae. Before Tyler and Barghoorn’s work, few fossils had been found that predated the Cambrian Period, which began about 540 million years ago. Now the two scientists were positing that life was present much earlier in the 4.55-billion-year history of our planet. How much further back it went remained for later scientists to discover.

In the next decades, paleontologists in Africa found 3-billion-year-old fossil traces of microscopic bacteria that had lived in massive marine reefs. Bacteria can also form what are called biofilms, colonies that grow in thin layers over surfaces such as rocks and the ocean floor, and scientists have found solid evidence for biofilms dating back 3.2 billion years.

But at the time of the NASA press conference, the oldest fossil claim belonged to UCLA’s William Schopf, the man who spoke skeptically about NASA’s finds at the same conference. During the 1960s, ’70s and ’80s, Schopf had become a leading expert on early life-forms, discovering fossils around the world, including 3-billion-year-old fossilized bacteria in South Africa. Then, in 1987, he and some colleagues reported that they had found the 3.465-billion-year-old microscopic fossils at a site called Warrawoona in the Western Australia outback—the ones he would display at the NASA press conference. The bacteria in the fossils were so sophisticated, Schopf says, that they indicate “life was flourishing at that time, and thus, life originated appreciably earlier than 3.5 billion years ago.”

Since then, scientists have developed other methods for detecting signs of early life on Earth. One involves measuring different isotopes, or atomic forms, of carbon; the ratio of the isotopes can indicate that the carbon was once part of a living thing, because organisms preferentially take up the lighter isotope, carbon-12. In 1996, a team of researchers reported that they had found life’s signature in rocks from Greenland dating back 3.83 billion years.
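
That ratio is conventionally reported as a delta-13C value relative to a reference standard. Below is a minimal sketch of the calculation in Python; the VPDB constant is the widely used reference, but the sample ratio is illustrative, not a measurement from the Greenland rocks.

    # Organisms preferentially take up carbon-12, so biological carbon shows a
    # depleted 13C/12C ratio, i.e. a strongly negative delta-13C value.
    VPDB_STANDARD = 0.0112372  # 13C/12C ratio of the Pee Dee Belemnite standard

    def delta_13c(sample_ratio):
        """Return the delta-13C value in per mil (parts per thousand)."""
        return (sample_ratio / VPDB_STANDARD - 1.0) * 1000.0

    sample_ratio = 0.0109  # hypothetical 13C/12C ratio measured in a rock sample
    print("delta-13C = %.1f per mil" % delta_13c(sample_ratio))  # about -30: consistent with biology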

The signs of life in Australia and Greenland were remarkably old, especially considering that life probably could not have persisted on Earth for the planet’s first few hundred million years. That’s because asteroids were bombarding it, boiling the oceans and likely sterilizing the planet’s surface before about 3.8 billion years ago. The fossil evidence suggested that life emerged soon after our world cooled down. As Schopf wrote in his book Cradle of Life, his 1987 discovery “tells us that early evolution proceeded very far very fast.”

A quick start to life on Earth could mean that life could also emerge quickly on other worlds—either Earth-like planets circling other stars, or perhaps even other planets or moons in our own solar system. Of these, Mars has long looked the most promising.

The surface of Mars today doesn’t seem like the sort of place hospitable to life. It is dry and cold, plunging down as far as -220 degrees Fahrenheit. Its thin atmosphere cannot block ultraviolet radiation from space, which would devastate any known living thing on the surface of the planet. But Mars, which is as old as Earth, might have been more hospitable in the past. The gullies and dry lake beds that mark the planet indicate that water once flowed there. There’s also reason to believe, astronomers say, that Mars’ early atmosphere was rich enough in heat-trapping carbon dioxide to create a greenhouse effect, warming the surface. In other words, early Mars was a lot like early Earth. If Mars had been warm and wet for millions or even billions of years, life might have had enough time to emerge. When conditions on the surface of Mars turned nasty, life may have become extinct there. But fossils may have been left behind. It’s even possible that life could have survived on Mars below the surface, judging from some microbes on Earth that thrive miles underground.

When NASA’s McKay presented his pictures of Martian fossils to the press that day in 1996, one of the millions of people who saw them on television was a young British environmental microbiologist named Andrew Steele. He had just earned a PhD at the University of Portsmouth, where he was studying bacterial biofilms that can absorb radioactivity from contaminated steel in nuclear facilities. An expert in microscopic imaging of microbes, Steele got McKay’s telephone number from directory assistance and called him. “I can get you a better picture than that,” he said, and convinced McKay to send him pieces of the meteorite. Steele’s analyses were so good that soon he was working for NASA.

Ironically, though, his work undercut NASA’s evidence: Steele discovered that Earthly bacteria had contaminated the Mars meteorite. Biofilms had formed and spread through cracks into its interior. Steele’s results didn’t disprove the Martian fossils outright—it’s possible that the meteorite contains both Martian fossils and Antarctic contaminants— but, he says, “The problem is, how do you tell the difference?” At the same time, other scientists pointed out that nonliving processes on Mars also could have created the globules and magnetite clumps that NASA scientists had held up as fossil evidence.

But McKay stands by the hypothesis that his microfossils are from Mars, saying it is “consistent as a package with a possible biological origin.” Any alternative explanation must account for all of the evidence, he says, not just one piece at a time.

The controversy has raised a profound question in the minds of many scientists: What does it take to prove the presence of life billions of years ago? In 2000, Oxford paleontologist Martin Brasier borrowed the original Warrawoona fossils from the Natural History Museum in London, and he and Steele and their colleagues have studied the chemistry and structure of the rocks. In 2002, they concluded that it was impossible to say whether the fossils were real, essentially subjecting Schopf’s work to the same skepticism that Schopf had expressed about the fossils from Mars. “The irony was not lost on me,” says Steele.

In particular, Schopf had proposed that his fossils were photosynthetic bacteria that captured sunlight in a shallow lagoon. But Brasier and Steele and co-workers concluded that the rocks had formed in hot water loaded with metals, perhaps around a superheated vent at the bottom of the ocean—hardly the sort of place where a sun-loving microbe could thrive. And microscopic analysis of the rock, Steele says, was ambiguous, as he demonstrated one day in his lab by popping a slide from the Warrawoona chert under a microscope rigged to his computer. “What are we looking at there?” he asks, picking a squiggle at random on his screen. “Some ancient dirt that’s been caught in a rock? Are we looking at life? Maybe, maybe. You can see how easily you can fool yourself. There’s nothing to say that bacteria can’t live in this, but there’s nothing to say that you are looking at bacteria.”

Schopf has responded to Steele’s criticism with new research of his own. Analyzing his samples further, he found that they were made of a form of carbon known as kerogen, which would be expected in the remains of bacteria. Of his critics, Schopf says, “they would like to keep the debate alive, but the evidence is overwhelming.”

The disagreement is typical of the fast-moving field. Geologist Christopher Fedo of George Washington University and geochronologist Martin Whitehouse of the Swedish Museum of Natural History have challenged the 3.83-billion-year-old molecular trace of light carbon from Greenland, saying the rock had formed from volcanic lava, which is much too hot for microbes to withstand. Other recent claims also are under assault. A year ago, a team of scientists made headlines with their report of tiny tunnels in 3.5-billion-year-old African rocks. The scientists argued that the tunnels were made by ancient bacteria around the time the rock formed. But Steele points out that bacteria might have dug those tunnels billions of years later. “If you dated the London Underground that way,” says Steele, “you’d say it was 50 million years old, because that’s how old the rocks are around it.”

Such debates may seem indecorous, but most scientists are happy to see them unfold. “What this will do is get a lot of people to roll up their sleeves and look for more stuff,” says MIT geologist John Grotzinger. To be sure, the debates are about subtleties in the fossil record, not about the existence of microbes long, long ago. Even a skeptic like Steele remains fairly confident that microbial biofilms lived 3.2 billion years ago. “You can’t miss them,” Steele says of their distinctive weblike filaments visible under a microscope. And not even critics have challenged the latest from Minik Rosing, of the University of Copenhagen’s Geological Museum, who has found the carbon isotope life signature in a sample of 3.7 billion-year-old rock from Greenland—the oldest undisputed evidence of life on Earth.

At stake in these debates is not just the timing of life’s early evolution, but the path it took. This past September, for example, Michael Tice and Donald Lowe of Stanford University reported on 3.416-billion-year-old mats of microbes preserved in rocks from South Africa. The microbes, they say, carried out photosynthesis but didn’t produce oxygen in the process. A small number of bacterial species today do the same—anoxygenic photosynthesis, it’s called—and Tice and Lowe suggest that such microbes, rather than the conventionally photosynthetic ones studied by Schopf and others, flourished during the early evolution of life. Figuring out life’s early chapters will tell scientists not only a great deal about the history of our planet. It will also guide their search for signs of life elsewhere in the universe—starting with Mars.

In January 2004, the NASA rovers Spirit and Opportunity began rolling across the Martian landscape. Within a few weeks, Opportunity had found the best evidence yet that water once flowed on the planet’s surface. The chemistry of rock it sampled from a plain called Meridiani Planum indicated that it had formed billions of years ago in a shallow, long-vanished sea. One of the most important results of the rover mission, says Grotzinger, a member of the rover science team, was the robot’s observation that rocks on Meridiani Planum don’t seem to have been crushed or cooked to the degree that Earth rocks of the same age have been—their crystal structure and layering remain intact. A paleontologist couldn’t ask for a better place to preserve a fossil for billions of years.

The past year has brought a flurry of tantalizing reports. An orbiting probe and ground-based telescopes detected methane in the atmosphere of Mars. On Earth, microbes produce copious amounts of methane, although it can also be produced by volcanic activity or chemical reactions in the planet’s crust. In February, reports raced through the media about a NASA study allegedly concluding that the Martian methane might have been produced by underground microbes. NASA headquarters quickly swooped in—perhaps worried about a repeat of the media frenzy surrounding the Martian meteorite—and declared that it had no direct data supporting claims for life on Mars.

But just a few days later, European scientists announced that they had detected formaldehyde in the Martian atmosphere, another compound that, on Earth, is produced by living things. Shortly thereafter, researchers at the European Space Agency released images of the Elysium Plains, a region along Mars’ equator. The texture of the landscape, they argued, shows that the area was a frozen ocean just a few million years ago—not long, in geological time. A frozen sea may still be there today, buried under a layer of volcanic dust. While water has yet to be found on Mars’ surface, some researchers studying Martian gullies say that the features may have been produced by underground aquifers, suggesting that water, and the life-forms that require water, might be hidden below the surface.

Andrew Steele is one of the scientists designing the next generation of equipment to probe for life on Mars. One tool he plans to export to Mars is called a microarray, a glass slide onto which different antibodies are attached. Each antibody recognizes and latches onto a specific molecule, and each dot of a particular antibody has been rigged to glow when it finds its molecular partner. Steele has preliminary evidence that the microarray can recognize fossil hopanes, molecules found in the cell walls of bacteria, in the remains of a 25-million-year-old biofilm.
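
In spirit, each spot on such a slide is a lookup from one antibody to one target molecule. The toy Python sketch below illustrates that readout logic only; apart from the hopanes mentioned above, the spot names and molecules are invented.

    # Toy model of an antibody microarray: a spot "glows" only when the sample
    # contains the molecule its antibody recognizes. Illustrative names only.
    ANTIBODY_SPOTS = {
        "anti-hopane": "hopane",            # bacterial cell-wall molecule (see text)
        "anti-protein-X": "protein X",      # hypothetical microbial protein
        "anti-chlorophyll": "chlorophyll",
    }

    def read_microarray(sample_molecules):
        """Map each spot name to True if that spot would fluoresce."""
        return {spot: target in sample_molecules
                for spot, target in ANTIBODY_SPOTS.items()}

    crushed_rock_extract = {"hopane", "quartz dust"}
    for spot, glows in read_microarray(crushed_rock_extract).items():
        print(spot, "GLOWS" if glows else "dark")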

This past September, Steele and his colleagues traveled to the rugged Arctic island of Svalbard, where they tested the tool in the area’s extreme environment as a prelude to deploying it on Mars. As armed Norwegian guards kept a lookout for polar bears, the scientists spent hours sitting on chilly rocks, analyzing fragments of stone. The trip was a success: the microarray antibodies detected proteins made by hardy bacteria in the rock samples, and the scientists avoided becoming food for the bears.

Steele is also working on a device called MASSE (Modular Assays for Solar System Exploration), which is tentatively slated to fly on a 2011 European Space Agency expedition to Mars. He envisions the rover crushing rocks into powder that is fed into MASSE, which will analyze the molecules with a microarray, searching for biological molecules.

Sooner, in 2009, NASA will launch the Mars Science Laboratory Rover. It’s designed to inspect the surface of rocks for peculiar textures left by biofilms. The Mars lab may also look for amino acids, the building blocks of proteins, or other organic compounds. Finding such compounds wouldn’t prove the existence of life on Mars, but it would bolster the case for it and spur NASA scientists to look more closely.

Difficult as the Mars analyses will be, they’re made even more complex by the threat of contamination. Mars has been visited by nine spacecraft, from Mars 2, a Soviet probe that crashed into the planet in 1971, to NASA’s Opportunity and Spirit. Any one of them might have carried hitchhiking Earth microbes. “It might be that they crash-landed and liked it there, and then the wind could blow them all over the place,” says Jan Toporski, a geologist at the University of Kiel, in Germany. And the same interplanetary game of bumper cars that hurtled a piece of Mars to Earth might have showered pieces of Earth on Mars. If one of those terrestrial rocks was contaminated with microbes, the organisms might have survived on Mars—for a time, at least—and left traces in the geology there. Still, scientists are confident they can develop tools to distinguish between imported Earth microbes and Martian ones.

Finding signs of life on Mars is by no means the only goal. “If you find a habitable environment and don’t find it inhabited, then that tells you something,” says Steele. “If there is no life, then why is there no life? The answer leads to more questions.” The first would be what makes life-abounding Earth so special. In the end, the effort being poured into detecting primitive life on Mars may prove its greatest worth right here at home.

Read more: https://www.smithsonianmag.com/science-nature/life-on-mars-78138144/

The Ultimate Guide to Understanding Augmented Reality (AR) Technology

Introduction to Augmented Reality (AR)

Augmented Reality (AR) may not be as exciting as a virtual reality roller coaster ride, but the technology is proving itself as a very useful tool in our everyday lives.

From social media filters to surgical procedures, AR is rapidly growing in popularity because it brings elements of the virtual world into our real world, enhancing the things we see, hear, and feel. When compared to other reality technologies, augmented reality lies in the middle of the mixed-reality spectrum, between the real world and the virtual world.

What is Augmented Reality (AR)?

An enhanced version of reality where live direct or indirect views of physical real-world environments are augmented with superimposed computer-generated images over a user’s view of the real world, thus enhancing one’s current perception of reality.

Would you like to learn more about commonly used terms in augmented reality? Visit our Augmented Reality Glossary to discover 30+ must-know industry terms.
Augmented Reality Explained

A Simple Explanation of Augmented Reality (AR): The word augmented comes from augment, which means to add or enhance something. In the case of Augmented Reality (also called AR), graphics, sounds, and touch feedback are added to our natural world to create an enhanced user experience.

Augmented Reality vs. Virtual Reality: Unlike virtual reality, which requires you to inhabit an entirely virtual environment, augmented reality uses your existing natural environment and simply overlays virtual information on top of it. As the virtual and real worlds harmoniously coexist, users of augmented reality experience a new and improved natural world where virtual information is used as a tool to assist in everyday activities.

Applications of augmented reality can be as simple as a text notification or as complicated as instructions for performing a life-threatening surgical procedure. They can highlight certain features, enhance understanding, and provide accessible and timely data. Cell phone apps and business applications by companies using augmented reality are a few of the many applications driving augmented reality development. The key point is that the information provided is highly topical and relevant to what you are doing.

Types of Augmented Reality

Augmented Reality (AR) Categories

Several categories of augmented reality technology exist, each differing in its objectives and use cases. Below, we explore the various types of technologies that make up augmented reality:

Marker Based Augmented Reality

Marker-based augmented reality (also called Image Recognition) uses a camera and some type of visual marker, such as a QR/2D code, to produce a result only when the marker is sensed by a reader. Marker-based applications use a camera on the device to distinguish a marker from any other real-world object. Distinct but simple patterns (such as a QR code) are used as markers because they can be easily recognized and do not require a lot of processing power to read. The marker’s position and orientation are also calculated, and some type of content and/or information is then overlaid on the marker.
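
As a rough illustration of that pipeline, here is a minimal sketch in Python using OpenCV’s ArUco markers, one common marker family; the library choice (OpenCV 4.7+), the dictionary, and the file names are assumptions, since no specific toolkit is named above.

    import cv2

    # Distinct but simple patterns keep detection cheap: a predefined 4x4
    # ArUco dictionary stands in for the QR-like marker described above.
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

    frame = cv2.imread("camera_frame.png")    # one frame from the device camera
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # The "reader" reacts only when a marker is sensed; the detected corners
    # give the marker's position and orientation, where content is overlaid.
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is not None:
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        cv2.imwrite("annotated_frame.png", frame)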

Markerless Augmented Reality

As one of the most widely implemented applications of augmented reality, markerless (also called location-based, position-based, or GPS) augmented reality uses a GPS, digital compass, velocity meter, or accelerometer embedded in the device to provide data based on your location. A strong force behind markerless augmented reality technology is the wide availability of smartphones and the location-detection features they provide. It is most commonly used for mapping directions, finding nearby businesses, and other location-centric mobile applications.
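
The location-based core of such an application can be sketched as ranking nearby points of interest from the device’s GPS fix. The snippet below uses the standard haversine great-circle formula; the coordinates and business names are made up for illustration.

    from math import radians, sin, cos, asin, sqrt

    EARTH_RADIUS_KM = 6371.0

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two (lat, lon) points, in kilometres."""
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
        return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

    device = (40.7580, -73.9855)   # hypothetical GPS fix from the phone
    businesses = {"cafe": (40.7614, -73.9776), "bookstore": (40.7527, -73.9772)}

    # Nearest first, as a location-centric AR overlay would order its labels.
    for name, pos in sorted(businesses.items(), key=lambda kv: haversine_km(*device, *kv[1])):
        print(name, round(haversine_km(*device, *pos), 2), "km")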

Projection Based Augmented Reality

Projection-based augmented reality works by projecting artificial light onto real-world surfaces. Projection-based augmented reality applications allow for human interaction by sending light onto a real-world surface and then sensing the human interaction (i.e., touch) with that projected light. Detecting the user’s interaction is done by differentiating between an expected (or known) projection and the altered projection (caused by the user’s interaction). Another interesting application of projection-based augmented reality uses laser plasma technology to project a three-dimensional (3D) interactive hologram into mid-air.
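
The expected-versus-altered comparison lends itself to simple frame differencing. The sketch below is one plausible reading of that idea, with NumPy arrays standing in for camera frames and thresholds chosen only for illustration.

    import numpy as np

    def detect_touch(expected, observed, pixel_thresh=40, area_thresh=200):
        """Flag a touch when enough pixels deviate from the known projection."""
        diff = np.abs(observed.astype(int) - expected.astype(int))
        altered_pixels = int((diff > pixel_thresh).sum())
        return altered_pixels > area_thresh

    expected = np.full((120, 160), 200, dtype=np.uint8)  # known projected light pattern
    observed = expected.copy()
    observed[40:70, 60:90] = 30                          # a hand shadowing the surface
    print(detect_touch(expected, observed))              # True -> interaction sensed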

Superimposition Based Augmented Reality

Superimposition-based augmented reality either partially or fully replaces the original view of an object with a newly augmented view of that same object. In superimposition-based augmented reality, object recognition plays a vital role, because the application cannot replace the original view with an augmented one if it cannot determine what the object is. A strong consumer-facing example of superimposition-based augmented reality can be found in the IKEA augmented reality furniture catalogue. By downloading an app and scanning selected pages in the printed or digital catalogue, users can place virtual IKEA furniture in their own homes with the help of augmented reality.

Augmented Reality News

Latest Developments in Augmented Reality (AR) News

The field of augmented reality is continually growing with new technological advancements, software improvements, and products. Keeping up with the latest augmented reality news is important for staying on top of this rapidly growing industry. We cover the latest in augmented reality, virtual reality, and mixed reality news.

How Does Augmented Reality Work?

How Does Augmented Reality (AR) Technology Work?

In order to understand how augmented reality technology works, one must first understand its objective: to bring computer-generated objects into the real world, visible only to the user.

In most augmented reality applications, a user will see both synthetic and natural light. This is done by overlaying projected images on top of a pair of see-through goggles or glasses, which allow the images and interactive virtual objects to layer on top of the user’s view of the real world. Augmented Reality devices are often self-contained, meaning that unlike the Oculus Rift or HTC Vive VR headsets, they are completely untethered and do not need a cable or desktop computer to function.

How Do Augmented Reality Devices Work (Inside)?

Augmented realities can be displayed on a wide variety of displays, from screens and monitors to handheld devices or glasses. Google Glass and other head-up displays (HUDs) put augmented reality directly onto your face, usually in the form of glasses. Handheld devices employ small displays that fit in users’ hands, including smartphones and tablets. As reality technologies continue to advance, augmented reality devices will gradually require less hardware and start being applied to things like contact lenses and virtual retinal displays.

Key Components of Augmented Reality Devices

1. Sensors and Cameras

Sensors are usually on the outside of the augmented reality device; they gather a user’s real-world interactions and communicate them to be processed and interpreted. Cameras, also located on the outside of the device, visually scan the surroundings to collect data about the area. The device takes this information, which often determines where surrounding physical objects are located, and formulates a digital model to determine the appropriate output. In the case of the Microsoft HoloLens, specific cameras perform specific duties, such as depth sensing. Depth-sensing cameras work in tandem with two “environment understanding cameras” on each side of the device. Another common type of camera is a standard several-megapixel camera (similar to the ones used in smartphones) that records pictures, videos, and sometimes information to assist with augmentation.

2. Projection

While “projection-based augmented reality” is a category in itself, here we are specifically referring to a miniature projector often found in a forward- and outward-facing position on wearable augmented reality headsets. The projector can essentially turn any surface into an interactive environment. As mentioned above, the information taken in by the cameras that examine the surrounding world is processed and then projected onto a surface in front of the user, which could be a wrist, a wall, or even another person. The use of projection in augmented reality devices means that screen real estate will eventually become a less important component. In the future, you may not need an iPad to play an online game of chess, because you will be able to play it on the tabletop in front of you.

3. Processing

Augmented reality devices are basically mini-supercomputers packed into tiny wearable devices. They require significant processing power and utilize many of the same components that our smartphones do, including a CPU, a GPU, flash memory, RAM, a Bluetooth/WiFi microchip, a global positioning system (GPS) microchip, and more. Advanced augmented reality devices such as the Microsoft HoloLens utilize an accelerometer (to measure the speed at which your head is moving), a gyroscope (to measure the tilt and orientation of your head), and a magnetometer (to function as a compass and figure out which direction your head is pointing) to provide a truly immersive experience.
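
To make those three sensors concrete, here is a small sketch of the standard tilt and heading equations they feed. The readings are invented, and the heading calculation is simplified (no tilt compensation); real headsets fuse these sensors with far more sophisticated filtering.

    from math import atan2, sqrt, degrees

    def orientation(ax, ay, az, mx, my):
        """Return (pitch, roll, heading) in degrees from raw sensor readings."""
        pitch = degrees(atan2(-ax, sqrt(ay * ay + az * az)))  # accelerometer: nod up/down
        roll = degrees(atan2(ay, az))                         # accelerometer: tilt left/right
        heading = (degrees(atan2(my, mx)) + 360.0) % 360.0    # magnetometer: compass direction
        return pitch, roll, heading

    # Hypothetical readings: head nearly level, facing roughly north-east.
    print(orientation(ax=0.05, ay=0.02, az=0.99, mx=0.3, my=0.3))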

4. Reflection

Mirrors are used in augmented reality devices to assist with the way your eye views the virtual image. Some augmented reality devices may have “an array of many small curved mirrors” (as with the Magic Leap device), and others may have a simple double-sided mirror with one surface reflecting incoming light to a side-mounted camera and the other surface reflecting light from a side-mounted display into the user’s eye. In the Microsoft HoloLens, the “mirrors” are see-through holographic lenses (Microsoft refers to them as waveguides) that use an optical projection system to beam holograms into your eyes. A so-called light engine emits the light toward two separate lenses (one for each eye), each consisting of three layers of glass for three different primary colors (blue, green, red). The light hits those layers and then enters the eye at specific angles, intensities and colors, producing a final holistic image on the eye’s retina. Regardless of method, all of these reflection paths have the same objective: to align the image with the user’s eye.

How Augmented Reality is Controlled

Augmented reality devices are often controlled either by a touchpad or by voice commands. The touchpad is usually somewhere on the device that is easily reachable. It works by sensing the pressure changes that occur when a user taps or swipes a specific spot. Voice commands work much as they do on our smartphones: a tiny microphone on the device picks up your voice, and a microprocessor interprets the commands. Voice commands, such as those on the Google Glass augmented reality device, are preprogrammed from a list of commands that you can use. On the Google Glass, nearly all of them start with “OK, Glass,” which alerts your glasses that a command is soon to follow. For example, “OK, Glass, take a picture” will send a command to the microprocessor to snap a photo of whatever you’re looking at.
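
Conceptually, that flow is a wake phrase followed by a lookup in the preprogrammed command list. The Python sketch below is a hypothetical illustration of the pattern, not the actual Glass software.

    WAKE_PHRASE = "ok, glass"

    COMMANDS = {                                    # preprogrammed command list
        "take a picture": lambda: print("snapping photo..."),
        "record a video": lambda: print("recording..."),
    }

    def handle_transcript(transcript):
        """Dispatch a recognized utterance if it begins with the wake phrase."""
        text = transcript.lower().strip()
        if not text.startswith(WAKE_PHRASE):
            return                                  # not addressed to the device
        command = text[len(WAKE_PHRASE):].strip(" ,")
        action = COMMANDS.get(command)
        action() if action else print("unknown command:", command)

    handle_transcript("OK, Glass, take a picture")  # -> snapping photo...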

Augmented Reality Companies

Discover Innovative Augmented Reality (AR) Startups and Companies

It takes bold visionaries and risk-takers to turn future technologies into realities. In the field of augmented reality (AR), many companies across the globe are working on this mission. Our mega list of mixed reality, virtual reality, and augmented reality companies covers the top companies and startups innovating in this space.

Augmented Reality Use Case Example: Healthcare

How is Augmented Reality Used in the Real World? 

Many of the top augmented reality companies are seeing great success by helping seasoned industries adopt and apply this new technology for their unique business needs. A strong example of augmented reality in use is in the field of healthcare. From a routine checkup, to a complex surgical procedure, augmented reality can provide immense benefits and efficiencies to both patient and healthcare professional.

Physical Exams

Imagine that you walk into your scheduled doctor (or dentist) appointment, only to find your doctor wearing an augmented reality headset (e.g., Google Glass). Although it may look strange, this technology allows him or her to access past records, pictures, and other historical data in real time to discuss with you. Instantly accessing this digital information, without having to log into a computer or check a records room, proves to be a major benefit to healthcare professionals. Integration of augmented-reality-assisted systems with patient record management technologies is already a highly desirable utility. Data integrity and accessibility are major benefits of this type of system, where record access becomes instantaneous and all professionals see the most current records, instructions, and policies.

Surgical Procedures

Let’s take this example one step further and imagine that we are going in for a surgical procedure. Before the anesthesia takes effect, we notice that the doctor is wearing an augmented reality headset. The doctor will use it throughout the procedure for things such as displaying surgical checklists and showing patient vital signs in a dashboard fashion. Augmented-reality-assisted surgical technologies help professionals by providing interfaces to operating-room medical devices, graphical overlay-based guidance, recording and archiving of procedures, live feeds to remote users, and instant access to patient records. They can also project computer-generated images onto any part of the body for treatment, or combine them with scanned real-time images. The benefits of using augmented reality include a reduced risk of delays in surgery due to unfamiliarity with new or existing conditions, a reduced risk of errors in performing surgical procedures, and a reduced risk of contamination when the device allows surgeons to access information hands-free, without removing their gloves to check instruments and data.

6 Predictions for the Future of Artificial Intelligence

Artificial Intelligence is no longer a futuristic technology, nor merely an attention-grabbing, fiction-infused tool. It is already here, allowing us to reap advantages like more precise predictions, more adaptive machine behaviour, and context-aware machine reactions to human voice commands. Machines continue to imitate human intelligence, responding automatically to human situations in ways that were unpredictable in the past.

At the pace at which Artificial Intelligence is paving the way for greater comfort, ease of use, and multifaceted advantages in everyday life, we may soon see AI make possible many things that were previously unthinkable. The latest AI research projects underway, and predictions about the roles AI will play, point to a future that is equally bright and shrouded in anticipation.

Here we explain six predictions about the future of AI that seem credible.

1. Robots for disaster management

AI, which refers to the intelligence of machines, will make machines more responsive and aware of human contexts. If one facet of modern technology can be expected to reap the greatest advantages of this new machine intelligence, it is robotics. Robots powered by Artificial Intelligence will be able to do many things previously ascribed only to humans. For example, robots can be depended upon for delicate and challenging roles like taking care of children or the elderly.

AI-powered robots will be able to tackle dangerous situations better than human beings. Robots will play a more proactive role in maintaining city traffic and managing disasters. For example, disasters like earthquakes, and their aftereffects in particular areas, can be envisaged and predicted through modern analytics, and as such disasters occur, AI-driven rescue apps can send humanitarian and precautionary messages to residents. When floodwater crosses a critical level, data-driven predictive analytics coupled with AI can guide humanitarian aid to reach the affected areas and their residents faster.

2. AI will be subject to misuse as well

As Artificial Intelligence continues to penetrate every area of our lives and activities, machines and digital interfaces in the future will enjoy greater autonomy and power than ever before. This, in turn, can open the way for vulnerabilities concerning security and misuse. By acquiring human-like cognitive abilities over many years, machines and digital interfaces could come to pose a grave threat to human beings.

Though machines behaving in an egoistic and biased manner, just like humans, is unthinkable as of now and has mostly remained a phenomenon of the fantasy world of science fiction, it could soon become a reality. Machines acquiring such psychological attributes of humans could become dangerous to human autonomy and overall existence.

Artificial Intelligence in machines that have full autonomy over user data can threaten privacy, as such machine tools can process and utilise user data for further business purposes. AI-generated customer interactions can prove to be a goldmine, but such open and unrestricted machine access to user data can have serious consequences for privacy.

3. Fully autonomous cars having the edge of AI

Autonomous, or driverless, cars are already a reality, and within just a couple of years we can expect them to hit the road as regular vehicles. But as most test drives have confirmed, driverless cars are so far equipped only to handle routine road situations and driver safety; they lack delicate decision-making power and the ability to respond to multifarious conditions like heavy rain, fog, snow, and windstorms. On the whole, human intelligence is still irreplaceable for driving. The ever-increasing prowess of AI gives us hope that, in time, AI-powered driverless cars will have all the attributes of human drivers behind the wheel.

4. The threat of unpredictable superhuman abilities

We have all read and watched numerous works of science fiction in which intelligent robots with superhuman abilities not only behave like humans but actually take on the role of saviors or destroyers of human beings. Well, that possibility now looms large before us. Many industry stalwarts, global tech thinkers, and scientists, including Stephen Hawking, Elon Musk and Bill Gates, have expressed concerns over the role of AI in shaping a superhuman intelligence that could, in time, dominate human actions and behavior. The question is whether this translates into a grim or a bright future for humans.

As of now, most predictions of and concerns about a dominant role for AI have their source in one thing: the sheer unpredictability of AI-powered machines. AI is still in a nascent state when measured against the huge possibilities it offers for future human beings. We can only say that machines are going to bear more similarity to human analytical abilities; how far machines can imitate human intelligence, and to what extent they can become more intelligent than humans, we do not know.

5. Cyborg technology

The way the human brain functions in coordination with the millions of nerves spread throughout our body is unique. This mind-body continuum has remained out of reach for robot makers and researchers, who have for years been trying to build machines capable of behaving and interacting humanely. But science fiction writers long ago came up with the concept of cyborgs: robots loaded with human brain cells and neurons. These cyborgs have been the closest avatars of human beings, sharing many things with humans.

While cyborgs have so far been a fictional possibility, the latest stem cell research has already made artificial limb production possible. In time, AI coupled with modern stem cell research could bring us similar nervous and neurological capabilities. Though cyborgs still seem a distant reality, progress in AI technology and stem cell research together is making them an ever brighter possibility.

6. Smart computers to solve climate change problem

Computers are now not only getting smarter in their analytical abilities; they can also answer questions with awareness of user context. This enhanced ability, which improves with every passing day, could ultimately deliver insights into the most complex and unpredictable fields of knowledge, such as the climate.

With the ability to analyse unlimited volumes of data as well as real-time situations, smart computers can now predict climate change, and the twists and turns of the environment, more precisely. AI can work actively to prevent environmental catastrophe and can warn humans of impending disasters that threaten life and living beings.

Finally, machines that have acquired more human-like intelligence can become buddies of future humans. Devices are already our closest companions for the greater part of the day. In the years to come, they will only take on a more humane role and demeanour.

Information Technology and Development: Beyond “Either/Or”

After years of drift and inattention to the problems of global development, during the past half decade the international community has dramatically increased its focus on strategies to help the people of the world’s poorest countries share in the benefits of globalization and escape the traps of poverty, disease, and lack of education. The decision of the world’s leaders at the United Nations Millennium Summit in September 2000 to adopt eight specific development goals provided an agreed political benchmark for measuring progress. Left open, however, were crucial issues about how best to achieve those goals.

A key unanswered question is the potential contribution that information and communication technology (ICT) can make to this effort. The question is not new. In 1984 the Commission for Worldwide Telecommunication Development (the Maitland Commission) issued an influential report, The Missing Link, citing the lack of telephone infrastructure in developing countries as a barrier to economic growth. The advent of the global information technology revolution in the 1990s set off a heated, sometimes acrimonious debate among development specialists and policymakers about the place of ICT in development.

On the one hand are those who see wiring the global South as a way to transcend decades of painful economic development and catapult even the poorest countries into the information age. As United Nations Secretary-General Kofi Annan observed in his Millennium Report, “New technology offers an unprecedented chance for developing countries to ‘leapfrog’ earlier stages of development. Everything must be done to maximize their peoples’ access to new information networks.” Proponents of this view not only stress the potential benefits of ICT but also argue that in an increasingly globalized economy, countries that fail to “get connected” will fall further and further behind.

At the opposite end are those who assert that “you can’t eat computers.” In the words of Microsoft’s Bill Gates, “Let’s be serious. Do people have a clear view of what it means to live on $1 a day? . . . There are things those people need at that level other than technology. . . . About 99 percent of the benefits of having [a PC] come when you’ve provided reasonable health and literacy to the person who’s going to sit down and use it.” Investing in ICT for poor countries, they argue, draws precious resources away from more urgent development needs. The lack of critical infrastructure, such as adequate energy grids, and of education keeps citizens of poorer countries from tapping ICT’s potential.

Eight Trends Driving the Future of Information Technology

The emerging world of information technology is one in which data is king, social platforms evolve as a new source of business intelligence, and cloud computing finally delivers on IT’s role as a driver of business growth, according to a new report from Accenture (NYSE: ACN).

The Accenture Technology Vision 2011 identifies eight emerging trends that challenge long-held assumptions about IT and are poised to reshape the business landscape. The report also offers “action steps” that high-performing businesses and governments can take to prepare for the new world of computing.

PM Narendra Modi, UN chief Antonio Guterres discuss issues related to global peace

New Delhi: Prime Minister Narendra Modi on Tuesday met United Nations Secretary-General Antonio Guterres and discussed a wide range of issues pertaining to global peace and prosperity. Guterres arrived here Monday on his maiden visit to India as the head of the world body, a visit that coincided with the commencement of events marking the 150th birth anniversary of Mahatma Gandhi.

“Had a wonderful meeting with Secretary-General of the @UN, Mr. @antonioguterres. We discussed a wide range of issues pertaining to global peace and prosperity,” Modi tweeted. “We are extremely grateful to him for coming to India for the Mahatma Gandhi International Sanitation Convention,” he said.

Sustainable Development Goals

The 2030 Agenda for Sustainable Development, adopted by all United Nations Member States in 2015, provides a shared blueprint for peace and prosperity for people and the planet, now and into the future. At its heart are the 17 Sustainable Development Goals (SDGs), which are an urgent call for action by all countries – developed and developing – in a global partnership. They recognize that ending poverty and other deprivations must go hand-in-hand with strategies that improve health and education, reduce inequality, and spur economic growth – all while tackling climate change and working to preserve our oceans and forests.

The SDGs build on decades of work by countries and the UN, including the UN Department of Economic and Social Affairs.

Today, the Division for Sustainable Development Goals (DSDG) in the United Nations Department of Economic and Social Affairs (UNDESA) provides substantive support and capacity-building for the SDGs and their related thematic issues, including water, energy, climate, oceans, urbanization, transport, science and technology, the Global Sustainable Development Report (GSDR), partnerships, and Small Island Developing States. DSDG plays a key role in the evaluation of UN system-wide implementation of the 2030 Agenda and in advocacy and outreach activities relating to the SDGs. In order to make the 2030 Agenda a reality, broad ownership of the SDGs must translate into a strong commitment by all stakeholders to implement the global goals. DSDG aims to help facilitate this engagement.

Science in the Future of India

India has voted for Science. In May, half a billion people cast their ballots, and they decisively favored spurring the development of the world’s second most populous nation. The reelected Prime Minister Manmohan Singh and his new coalition government have made a commitment to reduce poverty and disease, create employment, and stimulate rural and industrial development. Attaining these goals will require substantial new investments in science and technology (S&T) plus much greater investments in human capital.

Since achieving freedom in 1947, India has established many institutions devoted to science and higher education. Most notably, five Indian Institutes of Technology (IITs) were established between 1951 and 1963, and by 2008 there were 13 IITs: national degree-granting institutions devoted to the training of high-quality engineers and scientists. Despite the gap in infrastructure between advanced countries and India, there have been critical successes in areas such as space, atomic energy, and agriculture. In fundamental research too, India has made progress. Because of the innumerable demands on the economy, however, the higher-education sector has not received adequate support. Part of the reason for the decline in India’s university science education system in the past decades has been the preferential funding for R&D activities in national research laboratories.

Prime Minister Singh has recently announced an increase in government investment in S&T from the present 1% of gross domestic product (GDP) to 2% of GDP over the next year or two, an increase of unprecedented magnitude. The contribution of industry has also increased significantly in the past few years, now amounting to approximately 20% of the nation’s total investment in science R&D. And the government has begun appropriate administrative reforms as well. For example, two new government departments dealing with Earth system science and health research have been created. In addition, the Indian parliament has approved creating a National Science and Engineering Research Board, an entity somewhat similar to the U.S. National Science Foundation (NSF), that will be responsible for funding scientific research. It will provide competitive grants and establish new facilities in priority areas. Like NSF, the board will also produce annual “science indicators”: detailed analyses for measuring progress in S&T from year to year.


This is all good news. But the human resources essential for supporting an expanded S&T agenda are lacking. Young graduates today are readily attracted to professions other than those related to science and engineering; thus, banking, business, and information technology have become immensely popular. India must now focus on creating a large body of outstanding young people interested in taking up professions in science and engineering. To improve the quality of the university education system, new support is being provided. For example, five new Indian Institutes of Science Education and Research have been established in the past 3 years. Admitting undergraduates on the basis of competitive examinations (as do the IITs), these new national institutes will encourage bright young students to pursue science as a career, at both the undergraduate and Ph.D. levels. In addition, to meet the demand for top-class engineering graduates nationally and internationally, the country will increase the total number of IITs to 15.

Sixty percent of the Indian population is below the age of 25, and most reside in villages. This untapped talent represents a great potential asset. Around 600,000 scholarships are now available each year for talented school students from these areas, with an emphasis on those living in poverty. One million science awards are being given to students to promote interest in science, and 10,000 scholarships are available to support students who wish to pursue education beyond high school. In addition, the new government has already initiated important structural reforms in the education sector.

India’s citizens have risen to the occasion with their vote. The tasks and challenges for the new government are clear but daunting: It must now satisfy the aspirations of a billion people.