
Tech News - Mobile

Google Daydream powered Lenovo Mirage solo hits the market

Natasha Mathur
09 May 2018
3 min read
Just when people couldn't keep up with the excitement of the Oculus Go, launched at Facebook's F8 conference, Google added fuel to the fire by making the Lenovo Mirage Solo, the first standalone Daydream VR headset, available for purchase at $399.99.

Let's have a look at the features that are making this headset all the rage.

Self-contained VR headset

What makes this VR headset the talk of the town is that it is the first standalone Daydream VR headset: it doesn't require the excess baggage of connecting a phone before putting on the headset. Just put the headset on and explore the VR world without wires or added complexity. The hardware inside resembles that of a mobile device: a Snapdragon 835 processor with 4GB of RAM and 64GB of storage. A battery life of 2.5 hours makes the VR experience seamless. It includes embedded sensors along with a gyroscope, accelerometer, and magnetometer, plus a microSD slot, a USB Type-C port, a power button, volume buttons, and a 3.5mm headphone jack.

Position-tracking technology

The Lenovo Mirage Solo comes with WorldSense, a 6-degrees-of-freedom motion-tracking feature that lets you move around freely with the headset on, making the experience more immersive. WorldSense removes the need to set up external sensors. It offers:

- Two inside-out tracking cameras
- Built-in proximity sensors that detect the position of nearby objects

Display

The Lenovo Mirage Solo has a 5.5-inch LCD display, an effort to reduce the blurring that occurs as you move your head in the VR world. The screen has a 2,560 x 1,440-pixel resolution and a 110-degree field of view, similar to the Rift and Vive, making VR exploration even more immersive.
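To get a rough sense of what that resolution means spread across a 110-degree view, here is a back-of-the-envelope pixels-per-degree estimate. This is illustrative arithmetic only: the assumption that the panel is split evenly between the two eyes is ours, and real lens optics distort the mapping.

```python
# Rough pixels-per-degree estimate for a 2,560 x 1,440 panel with a
# 110-degree field of view (illustrative arithmetic only; real optics
# distort the pixel-to-angle mapping across the lens).
panel_width_px = 2560        # full panel width, shared by both eyes (assumed)
per_eye_px = panel_width_px / 2
fov_deg = 110
ppd = per_eye_px / fov_deg
print(f"~{ppd:.1f} pixels per degree")
```

For comparison, the fovea of the human eye resolves roughly 60 pixels per degree, which is why text in headsets of this generation still looks soft.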
Design

The headset body is primarily matte white plastic with black and gray accents, and a solid plastic strap that wraps around the head. The Lenovo Mirage Solo is a self-contained headset with a sturdy build, yet some people find it bulky, as most of the weight rests on the wearer's forehead. It is adjustable, however, as the strap can be brought all the way around the skull. The display housing also blocks incoming light without disturbing the image, making the headset easy to move.

Sound

The Lenovo Mirage Solo comes with two microphones, but users need to plug their own headphones into the 3.5mm jack, as it has no built-in speakers.

Beyond the features above, the Mirage Solo depends on the Daydream library for content. The catalog has more than 350 games and apps, with over 70 titles optimized for WorldSense.

As you can see, the Mirage Solo is not flawless: it suffers from a bulky design, no built-in speakers, and a limited app library. But the pros outweigh the cons here, and it goes without saying that the Lenovo Mirage Solo is set to shake up the VR experience. To know more, visit the official Google Daydream blog.

Related reads:
- Oculus Go, the first stand-alone VR headset arrives!
- Understanding the hype behind Magic Leap's New Augmented Reality Headsets
- Build a Virtual Reality Solar System in Unity for Google Cardboard


Google open sources Seurat to bring high precision graphics to Mobile VR

Sugandha Lahoti
08 May 2018
2 min read
Google has open sourced Seurat, its scene simplification tool for mobile VR. Google Seurat was first announced at the 2017 I/O conference to help developers bring high-precision graphics to standalone virtual reality headsets. Now that it is open to the developer community, developers can bring visually stunning scenes to their own VR applications while having the flexibility to customize the tool for their own workflows. Seurat can process complex 3D scenes that couldn't be run in real time even on the highest-performance desktop hardware into a representation that renders efficiently on mobile hardware.

How Google Seurat works

Polygons are generally used to compose 3D images in computer graphics; the polygon count refers to the number of polygons rendered per frame. Google Seurat reduces the overall polygon count displayed at any given time, lowering the required processing power and resources. It takes advantage of the limited viewing region available in mobile VR to optimize the geometry and textures in a scene. In other words, Seurat takes all of the possible viewpoints a VR user may see and removes the parts of the 3D environment they could never see. By using this limited range of movement to its advantage, Seurat removes object permanence from the equation: if users can't see something in virtual reality, chances are it doesn't actually exist.

On a more technical level, Google Seurat takes RGBD images (color and depth) as input and generates a textured mesh to simplify scenes, targeting a configurable number of triangles, texture size, and fill rate. The result is immersive VR experiences on standalone headsets. This scene simplification technology was used to bring a 'Rogue One: A Star Wars Story' scene to a standalone VR experience.
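The core idea, discarding geometry that no viewpoint inside the user's limited "headbox" can ever see, can be sketched with a toy visibility test. This is an illustration of the concept only, not Seurat's actual algorithm (which works from RGBD renders, handles occlusion, and rebuilds a textured mesh); the triangle representation and the simple back-face test here are our own simplifications.

```python
# Toy sketch of viewpoint-based culling (not Seurat's real pipeline):
# keep a triangle only if at least one candidate viewpoint in the headbox
# faces its front side.

def visible_from(viewpoint, tri_center, tri_normal):
    # The front face is visible when the surface normal points back
    # toward the viewer (negative dot product with the view direction).
    view_dir = tuple(c - v for v, c in zip(viewpoint, tri_center))
    return sum(d * n for d, n in zip(view_dir, tri_normal)) < 0

def simplify(triangles, headbox_viewpoints):
    return [t for t in triangles
            if any(visible_from(v, t["center"], t["normal"])
                   for v in headbox_viewpoints)]

# Two surfaces at the same spot: one faces the headbox, one faces away.
tris = [
    {"name": "front_wall", "center": (0, 0, 5), "normal": (0, 0, -1)},
    {"name": "back_face",  "center": (0, 0, 5), "normal": (0, 0, 1)},
]
headbox = [(0, 0, 0), (0.5, 0, 0), (0, 0.5, 0)]  # limited head motion
print([t["name"] for t in simplify(tris, headbox)])  # ['front_wall']
```

The surface the user could never see is dropped entirely, which is exactly the kind of reduction that lets a desktop-class scene render on mobile hardware.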
Developers can start working with Seurat right away via its GitHub page, which contains the documentation and source code needed to implement it in their projects. Alongside Seurat, Google also launched the Lenovo Mirage Solo, the first headset on the standalone Daydream VR platform.

Related reads:
- Top 7 modern Virtual Reality hardware systems
- Oculus Go, the first stand-alone VR headset arrives!
- Leap Motion open sources its $100 augmented reality headset, North Star


Google’s Android Things, developer preview 8: First look

Kunal Chaudhari
07 May 2018
3 min read
Last month, Google announced Developer Preview 8, the final preview release of Android Things before the upcoming stable release. Earlier this year, Google also showcased the capabilities of Android Things for creating IoT products such as smart displays, speakers, and 3D printers at the Consumer Electronics Show (CES), with partners like Lenovo, LG, JBL, iHome, and Sony. This developer preview solidifies the chances of Android Things becoming Google's official IoT platform.

This release focuses on improvements to app installation and permission management on the platform. There will be no breaking changes to the Android Things API before the stable v1.0 SDK, which is expected to be released at Google I/O this month. Let's take a closer look at some of the notable features in this new release.

Production-focused console enhancements

Several changes have been made to the Android Things developer console with a focus on building and shipping production-ready devices:

- Visual storage layout: configure the device storage allocated to apps and data for each build, and get an overview of how much storage your apps require.
- Font/locale controls: a set of supported fonts and locales packaged into each build.
- Group sharing: product sharing has been extended to include support for Google Groups.

App library

This new addition to the Android Things ecosystem allows developers to manage APKs more easily, without packaging them together in a separate zipped bundle. It helps you track individual versions of your app, review permissions, and share your app with other console users. Take a look at the official App Library documentation for more details.

Permissions

This feature is quite useful to developers, as it gives them more control over the permissions used by apps on their devices.
On mobile devices, apps request permissions at runtime and end users grant them. Early releases of Android Things granted these permissions automatically while booting the device. With Developer Preview 8, permissions are instead granted through a new user interface in the developer console.

These are just a few of the many updates in this developer preview of Android Things. To read the entire list of changes, please check out the release notes.

Related reads:
- Getting Started with Android Things
- Build your first Android app with Kotlin

Oculus Go, the first stand-alone VR headset arrives!

Sugandha Lahoti
03 May 2018
3 min read
At the two-day F8 conference hosted by Facebook, Oculus unveiled a new virtual reality headset. The Oculus Go is priced at an astonishing $199, far less than its predecessors (the Oculus Rift costs around $399). Here's a quick rundown of the key features.

Self-contained headset

The Oculus Go is completely self-contained and standalone. Everything, including the hardware, screen, and processor, is contained within the headset, and the functional pistol-grip Oculus controller is included in the box. You don't need a special computer, graphics card, game console, or even a phone to operate the VR device, as it is completely autonomous.

Rich display

The Oculus Go is equipped with a 5.5-inch, 2,560 x 1,440-pixel LCD display that looks particularly crisp when reading text or watching videos. Optimized 3D graphics reduce the screen-door effect typically encountered on most VR headsets, and fixed foveated rendering, which renders the center of the display more sharply than the edges, makes many apps look even better.

Powerful sound

Spatial audio drivers are built into the headset, providing direct, immersive surround sound without the need for earphones. Alternatively, it also has a 3.5mm audio jack.

Lightweight and comfortable

The Oculus Go is comfortable and well designed. The goggles have breathable fabrics, injection foam molding, and other advances in wearable materials for better comfort, and the headset is lightweight and portable.

For all its great features, the product is not without limitations. The screen has a narrower field of view (FOV) than the Oculus Rift and HTC Vive. It also does not include a slider to adjust interpupillary distance, i.e. how the images line up with your own face. And the Oculus Go recognizes only three degrees of freedom (3DOF): the VR effect holds when you rotate or tilt your head, but breaks as you lean in any direction.
The Go does not offer positional tracking while seated or walking. Nevertheless, Oculus currently supports over 1,000 existing apps, and the headset pairs with both iPhones and Android phones, making it one of the most accessible VR headsets around right now. Pricing is set at $199 for the 32GB model and $249 for the 64GB version. Consumers can now purchase the headset via the Oculus website in 23 countries.

Related reads:
- Facebook's F8 Conference – 5 key announcements
- Understanding the hype behind Magic Leap's New Augmented Reality Headsets
- Leap Motion open sources its $100 augmented reality headset, North Star


Build a Virtual Reality Solar System in Unity for Google Cardboard

Sugandha Lahoti
25 Apr 2018
21 min read
In today's tutorial, we will build a visualization of a newly discovered solar system, leveraging the virtual reality development process to illustrate the power of VR and the ease of use of the Unity 3D engine. This project is a dioramic scene in which the user floats in space, observing the movement of planets within the TRAPPIST-1 planetary system. In February 2017, astronomers announced the discovery of seven planets orbiting an ultra-cool dwarf star slightly larger than Jupiter. We will use this information to build a virtual environment to run on Google Cardboard (Android and iOS) or other compatible devices.

We will additionally cover the following topics:

- Platform setup: Download and install the platform-specific software needed to build an application on your target device. Experienced mobile developers with the latest Android or iOS SDK may skip this step.
- Google Cardboard setup: This package of development tools facilitates display and interaction on a Cardboard device.
- Unity environment setup: Initializing Unity's Project Settings in preparation for a VR environment.
- Building the TRAPPIST-1 system: Design and implement the Solar System project.
- Build for your device: Build and install the project onto a mobile device for viewing in Google Cardboard.

Platform setup

Before we begin building the solar system, we must set up our computer environment to build the runtime application for a given VR device. If you have never built a Unity application for Android or iOS, you will need to download and install the Software Development Kit (SDK) for your chosen platform. An SDK is a set of tools that lets you build an application for a specific software package, hardware platform, game console, or operating system. Installing the SDK may require additional tools or specific files to complete the process, and the requirements change from year to year as operating systems and hardware platforms undergo updates and revisions.
To deal with this nightmare, Unity maintains an impressive set of platform-specific instructions to ease the setup process. Their list contains detailed instructions for the following platforms: Apple Mac, Apple TV, Android, iOS, Samsung TV, Standalone, Tizen, Web Player, WebGL, and Windows.

For this project, we will be building for the most common mobile devices: Android or iOS. The first step is to visit either of the following links to prepare your computer:

- Android: Android users will need Android Developer Studio, the Java Virtual Machine (JVM), and assorted drivers. Follow this link for installation instructions and files: https://docs.unity3d.com/Manual/Android-sdksetup.html.
- Apple iOS: iOS builds are created on a Mac and require an Apple Developer account and the latest version of the Xcode development tools. However, if you've previously built an iOS app, these conditions will have already been met by your system. For the complete instructions, follow this link: https://docs.unity3d.com/Manual/iphone-GettingStarted.html.

Google Cardboard setup

Like the Unity documentation website, Google maintains an in-depth guide for the Google VR SDK for Unity, a set of tools and examples. This SDK provides the following features on the device:

- User head tracking
- Side-by-side stereo rendering
- Detection of user interactions (via trigger or controller)
- Automatic stereo configuration for a specific VR viewer
- Distortion correction
- Automatic gyro drift correction

These features are all contained in one easy-to-use package that will be imported into our Unity scene. Download the SDK from the following link before moving on to the next step: http://developers.google.com/cardboard/unity/download.

At the time of writing, the current version of the Google VR SDK for Unity is 1.110.1, available via a GitHub repository. The previous link should take you to the latest version of the SDK.
However, when starting a new project, be sure to compare the SDK version requirements with your installed version of Unity.

Setting up the Unity environment

Like all projects, we will begin by launching Unity and creating a new project. The first steps create a project folder containing several files and directories:

1. Launch the Unity application.
2. Choose the New option after the application splash screen loads.
3. Save the project as Trappist1 in a location of your choice, as demonstrated in Figure 2.2.

To prepare for VR, we will adjust the Build Settings and Player Settings windows. Open Build Settings from File | Build Settings, select the Platform for your target device (iOS or Android), and click the Switch Platform button to confirm the change. The Unity icon in the right-hand column of the platform panel indicates the currently selected build platform. By default, it will appear next to the Standalone option; after switching, the icon should appear next to Android or iOS, as shown in Figure 2.3.

Note for Android developers: Ericsson Texture Compression (ETC) is the standard texture compression format on Android. Unity defaults to ETC (default), which is supported on all current Android devices but does not support textures that have an alpha channel. ETC2 supports alpha channels and provides improved quality for RGB textures on Android devices that support OpenGL ES 3.0. Since we will not need alpha channels, we will stick with ETC (default) for this project.

Next, open the Player Settings by clicking the button at the bottom of the window; the PlayerSettings panel will open in the Inspector panel. Scroll down to Other Settings (Unity 5.5 through 2017.1) or XR Settings and check the Virtual Reality Supported checkbox. A list of choices will appear for selecting VR SDKs.
Add Cardboard support to the list, as shown in Figure 2.4.

You will also need to create a valid Bundle Identifier or Package Name under the Identification section of Other Settings. The value should follow the reverse-DNS format of com.yourCompanyName.ProjectName, using alphanumeric characters, periods, and hyphens. The default value must be changed in order to build your application.

Android development note: Bundle Identifiers are unique. When an app is built and released for Android, the Bundle Identifier becomes the app's package name and cannot be changed. This restriction and other requirements are discussed in this Android documentation link: http://developer.Android.com/reference/Android/content/pm/PackageInfo.html.

Apple development note: Once you have registered a Bundle Identifier to a Personal Team in Xcode, the same Bundle Identifier cannot be registered to another Apple Developer Program team in the future. This means that, while testing your game using a free Apple ID and a Personal Team, you should choose a Bundle Identifier that is for testing only; you will not be able to use the same Bundle Identifier to release the game. An easy way to do this is to add Test to the end of whatever Bundle Identifier you were going to use, for example, com.MyCompany.VRTrappistTest. When you release an app, its Bundle Identifier must be unique to your app and cannot be changed after your app has been submitted to the App Store.

Set the Minimum API Level to Android Nougat (API level 24) and leave the Target API on Automatic. Close the Build Settings window and save the project before continuing.

Choose Assets | Import Package | Custom Package... to import the GoogleVRForUnity.unitypackage previously downloaded from http://developers.google.com/cardboard/unity/download. The package will begin decompressing the scripts, assets, and plugins needed to build a Cardboard product. When completed, confirm that all options are selected and choose Import.
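The reverse-DNS shape of a Bundle Identifier can be sanity-checked with a simple pattern. The sketch below is illustrative only: it captures the rough "segments of alphanumerics and hyphens, separated by periods" shape described above, and the real Android and iOS validation rules differ in detail (Android package names, for instance, disallow hyphens and require segments to start with a letter).

```python
import re

# Rough reverse-DNS shape check for a Bundle Identifier (illustrative only;
# real Android/iOS rules are stricter and differ between platforms).
BUNDLE_ID = re.compile(r"^[A-Za-z0-9-]+(\.[A-Za-z0-9-]+){2,}$")

def looks_like_bundle_id(value):
    return bool(BUNDLE_ID.match(value))

print(looks_like_bundle_id("com.MyCompany.VRTrappistTest"))  # True
print(looks_like_bundle_id("Trappist1"))                     # False: no reverse-DNS segments
```

A check like this is handy in build scripts, since a bad identifier otherwise only surfaces as an error late in the device build.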
Once the package has been installed, a new menu titled GoogleVR will be available in the main menu. This provides easy access to the GoogleVR documentation and Editor Settings. Additionally, a directory titled GoogleVR will appear in the Project panel.

1. Right-click in the Project panel and choose Create | Folder to add the following directories: Materials, Scenes, and Scripts.
2. Choose File | Save Scenes to save the default scene. I'm using the very original Main Scene and saving it to the Scenes folder created in the previous step.
3. Choose File | Save Project from the main menu to complete the setup portion of this project.

Building the TRAPPIST-1 System

Now that we have Unity configured to build for our device, we can begin building our space-themed VR environment. We have designed this project to focus on building and deploying a VR experience. If you are moderately familiar with Unity, this project will be very simple; again, this is by design. However, if you are relatively new, then the basic 3D primitives, a few textures, and a simple orbiting script will be a great way to expand your understanding of the development platform.

Create a new script by selecting Assets | Create | C# Script from the main menu. By default, the script will be titled NewBehaviourScript. Single-click this item in the Project window and rename it OrbitController. Keep the project organized by dragging OrbitController's icon to the Scripts folder. Then double-click the OrbitController script item to edit it. Doing this will open a script editor as a separate application and load the OrbitController script for editing.
The following code block illustrates the default script text:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class OrbitController : MonoBehaviour {

    // Use this for initialization
    void Start () {

    }

    // Update is called once per frame
    void Update () {

    }
}
```

This script will be used to determine each planet's location, orientation, and relative velocity within the system. The specific dimensions will be added later, but we will start by adding some public variables. Starting on line 7, add the following five statements:

```csharp
public Transform orbitPivot;
public float orbitSpeed;
public float rotationSpeed;
public float planetRadius;
public float distFromStar;
```

Since we will be referring to these variables in the near future, we need a better understanding of how they will be used:

- orbitPivot stores the position of the object that each planet will revolve around (in this case, the star TRAPPIST-1).
- orbitSpeed controls how fast each planet revolves around the central star.
- rotationSpeed is how fast an object rotates around its own axis.
- planetRadius represents a planet's radius compared to Earth. This value will be used to set the planet's size in our environment.
- distFromStar is a planet's distance in Astronomical Units (AU) from the central star.

Continue by adding the following lines of code to the Start() method of the OrbitController script:

```csharp
// Use this for initialization
void Start () {
    // Creates a random position along the orbit path
    Vector2 randomPosition = Random.insideUnitCircle;
    transform.position = new Vector3 (randomPosition.x, 0f, randomPosition.y) * distFromStar;

    // Sets the size of the GameObject to the Planet radius value
    transform.localScale = Vector3.one * planetRadius;
}
```

As shown within this script, the Start() method is used to set the initial position of each planet.
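The placement math in Start() can be sketched outside Unity. In the plain-Python sketch below, a rejection-sampling helper stands in for Unity's Random.insideUnitCircle (an assumption of ours, chosen to mimic its documented behavior of returning a random point inside the unit circle); the variable names mirror the script's.

```python
import math
import random

# Stand-in for Unity's Random.insideUnitCircle: a uniform random point
# inside the unit circle, via rejection sampling.
def random_inside_unit_circle(rng):
    while True:
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

def initial_position(dist_from_star, rng=random):
    rx, ry = random_inside_unit_circle(rng)
    # Start() maps the 2D point onto the XZ plane, then scales by distFromStar
    return (rx * dist_from_star, 0.0, ry * dist_from_star)

pos = initial_position(11)  # Planet b's Dist From Star value
assert math.hypot(pos[0], pos[2]) <= 11  # never farther out than distFromStar
assert pos[1] == 0.0                     # stays in the orbital plane
```

Note that because insideUnitCircle samples within the disc rather than on its rim, each planet can start anywhere up to distFromStar from the star; the orbit script then revolves it around the pivot at whatever radius it landed on.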
We will add the dimensions when we create the planets, and this script will pull those values to set the starting point of each game object at runtime. Next, modify the Update() method by adding two additional lines of code, as indicated in the following code block:

```csharp
// Update is called once per frame. This code block updates the planet's
// position during each runtime frame.
void Update () {
    this.transform.RotateAround (orbitPivot.position, Vector3.up, orbitSpeed * Time.deltaTime);
    this.transform.Rotate (Vector3.up, rotationSpeed * Time.deltaTime);
}
```

This method is called once per frame while the program is running. Within Update(), the location for each object is determined by computing where the object should be during the next frame. this.transform.RotateAround uses the sun's pivot point to determine where the current GameObject (identified in the script by this) should appear in this frame. Then this.transform.Rotate updates how much the planet has rotated since the last frame. Save the script and return to Unity.

Now that we have our first script, we can begin building the star and its planets. For this process, we will use Unity's primitive 3D GameObjects to create the celestial bodies. Create a new sphere using GameObject | 3D Object | Sphere. This object will represent the star TRAPPIST-1; it will reside in the center of our solar system and will serve as the pivot for all seven planets. Right-click on the newly created Sphere object in the Hierarchy window, select Rename, and rename the object Star. Using the Inspector tab, set the object to Position: 0,0,0 and Scale: 1,1,1.

With the Star selected, locate the Add Component button in the Inspector panel. Click the button and enter orbitcontroller in the search box. Double-click on the OrbitController script icon when it appears. The script is now a component of the star.

Create another sphere using GameObject | 3D Object | Sphere and position it anywhere in the scene, with the default scale of 1,1,1.
Rename the object Planet b.

Figure 2.5, from the TRAPPIST-1 Wikipedia page, shows the relative orbital period, distance from the star, radius, and mass of each planet. We will use these dimensions and names to complete the setup of our VR environment. Each value will be entered as a public variable on its associated GameObject.

Apply the OrbitController script to the Planet b asset by dragging the script icon to the planet in the Scene window or to the Planet b object in the Hierarchy window. Planet b is our first planet and will serve as a prototype for the rest of the system.

Set the Orbit Pivot point of Planet b in the Inspector. Do this by clicking the Selector Target next to the Orbit Pivot field (see Figure 2.6), then selecting Star from the list of objects. The field value will change from None (Transform) to Star (Transform); our script will use the origin point of the selected GameObject as its pivot point. Go back, select the Star GameObject, and set its Orbit Pivot to Star as we did with Planet b. Save the scene.

Now that our template planet has the OrbitController script, we can create the remaining planets:

1. Duplicate the Planet b GameObject six times by right-clicking on it and choosing Duplicate.
2. Rename each copy Planet c through Planet h.
3. Set the public variables for each GameObject, using the following chart:

GameObject | Orbit Speed | Rotation Speed | Planet Radius | Dist From Star
Star       | 0           | 2              | 6             | 0
Planet b   | .151        | 5              | 0.85          | 11
Planet c   | .242        | 5              | 1.38          | 15
Planet d   | .405        | 5              | 0.41          | 21
Planet e   | .61         | 5              | 0.62          | 28
Planet f   | .921        | 5              | 0.68          | 37
Planet g   | 1.235       | 5              | 1.34          | 45
Planet h   | 1.80        | 5              | 0.76          | 60

Table 2.1: TRAPPIST-1 GameObject Transform settings

4. Create an empty GameObject by right-clicking in the Hierarchy panel and selecting Create Empty. This item will help keep the Hierarchy window organized.
5. Rename the item Planets and drag Planet b through Planet h into the empty item.

This completes the layout of our solar system, and we can now focus on setting a location for the stationary player.
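The orbital motion configured above can be sanity-checked outside Unity. The sketch below reimplements the per-frame rotation from Update() in plain Python (one consistent handedness is assumed; Unity's left-handed convention may flip the rotation direction), then uses Table 2.1's Orbit Speed column to work out how long one in-scene orbit takes, since RotateAround advances orbitSpeed degrees per second of runtime.

```python
import math

# Plain-Python sketch of RotateAround: rotate a point about a pivot, around
# the world Y (up) axis, by angle_deg degrees. (Illustrative only; Unity's
# left-handed convention may reverse the direction of travel.)
def rotate_around_y(point, pivot, angle_deg):
    px, py, pz = (point[i] - pivot[i] for i in range(3))
    a = math.radians(angle_deg)
    rx = px * math.cos(a) + pz * math.sin(a)
    rz = -px * math.sin(a) + pz * math.cos(a)
    return (rx + pivot[0], py + pivot[1], rz + pivot[2])

# A quarter-orbit for a planet 11 units from a star at the origin:
p = rotate_around_y((11, 0, 0), (0, 0, 0), 90)
assert all(abs(a - b) < 1e-9 for a, b in zip(p, (0, 0, -11)))

# A full orbit takes 360 / orbitSpeed seconds (Orbit Speed from Table 2.1):
orbit_speed = {"Planet b": 0.151, "Planet c": 0.242, "Planet d": 0.405,
               "Planet e": 0.61, "Planet f": 0.921, "Planet g": 1.235,
               "Planet h": 1.80}
period_min = {name: 360.0 / s / 60 for name, s in orbit_speed.items()}
print(f"Planet b laps the star every {period_min['Planet b']:.1f} minutes")
```

Because the rotation preserves distance from the pivot, the orbit radius never drifts over time, which is why the script can get away with setting the radius once in Start().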
Our player will not have the luxury of motion, so we must determine the optimal point of view for the scene:

1. Run the simulation. Figure 2.7 illustrates the layout being used to build and edit the scene.
2. With the scene running and the Main Camera selected, use the Move and Rotate tools or the Transform fields to adjust the position of the camera in the Scene window, looking for a wide view of the action in the Game window or an interesting vantage point. Do not stop the simulation once you identify a position; stopping the simulation will reset the Transform fields back to their original values.
3. Click the small Options gear in the Transform panel and select Copy Component. This will store a copy of the Transform settings to the clipboard.
4. Stop the simulation. You will notice that the Main Camera position and rotation have reverted to their original settings.
5. Click the Transform gear again and select Paste Component Values to set the Transform fields to the desired values.
6. Save the scene and project.

You might have noticed that we cannot really tell how fast the planets are rotating. This is because the planets are simple spheres without details, which can be fixed by adding materials to each planet. Since we do not really know what these planets look like, we will take a creative approach and go for aesthetics over scientific accuracy. The internet is a great source for the images we need; a simple Google search for planetary textures will turn up thousands of options. We will use a collection of these images to create materials for the planets and the TRAPPIST-1 star.

Open a web browser and search Google for planet textures. You will need one texture for each planet and one more for the star. Download the textures to your computer and rename them something memorable (that is, planet_b_mat...).
Alternatively, you can download a complete set of textures from the Resources section of the supporting website: http://zephyr9.pairsite.com/vrblueprints/Trappist1/.

Copy the images to the Trappist1/Assets/Materials folder. Switch back to Unity and open the Materials folder in the Project panel. Drag each texture to its corresponding GameObject in the Hierarchy panel; notice that each time you do this, Unity creates a new material and assigns it to the planet GameObject.

Run the simulation again and observe the movement of the planets. Adjust each planet's Orbit Speed and Rotation Speed until they feel natural. Take a bit of creative license here, leaning more on the scene's aesthetic quality than on scientific accuracy. Save the scene and the project.

For the final design phase, we will add a space-themed background using a Skybox. Skyboxes are rendered components that create the backdrop for Unity scenes. They illustrate the world beyond the 3D geometry, creating an atmosphere to match the setting. Skyboxes can be constructed of solids, gradients, or images using a variety of graphic programs and applications. For this project, we will find a suitable component in the Asset Store:

1. Load the Asset Store from the Window menu.
2. Search for a free space-themed skybox using the phrase space skybox price:0.
3. Select a package and use the Download button to import the package into the Scene.
4. Select Window | Lighting | Settings from the main menu.
5. In the Scene section, click on the Selector Target for the Skybox Material and choose the newly downloaded skybox.
6. Save the scene and the project.

With that last step complete, we are done with the design and development phase of the project. Next, we will move on to building the application and transferring it to a device.

Building the application

To experience this simulation in VR, we need our scene to run on a head-mounted display as a stereoscopic display.
The app needs to compile the proper viewing parameters, capture and process head-tracking data, and correct for visual distortion. Considering the number of VR devices we would have to account for, the task is nothing short of daunting. Luckily, Google VR facilitates all of this in one easy-to-use plugin.

The process for building the mobile application will depend on the mobile platform you are targeting. If you have previously built and installed a Unity app on a mobile device, many of these steps will have already been completed, and a few will apply updates to your existing software.

Note: Unity is a fantastic software platform with a rich community and an attentive development staff. During the writing of this book, we tackled software updates (5.5 through 2017.3) and various changes in the VR development process. Although we are including the simplified building steps, it is important to check Google's VR documentation for the latest software updates and detailed instructions:

- Android: https://developers.google.com/vr/unity/get-started
- iOS: https://developers.google.com/vr/unity/get-started-ios

Android instructions

If you are just starting out building applications from Unity, we suggest starting with the Android process. The workflow for getting your project exported from Unity and playing on your device is short and straightforward:

1. On your Android device, navigate to Settings | About phone or Settings | About Device | Software Info.
2. Scroll down to Build number and tap the item seven times. A popup will appear, confirming that you are now a developer.
3. Navigate to Settings | Developer options | Debugging and enable USB debugging.

Building an Android application

In your project directory (at the same level as the Assets folder), create a Build folder. Connect your Android device to the computer using a USB cable. You may see a prompt asking you to confirm that you wish to enable USB debugging on the device; if so, click OK.
In Unity, select File | Build Settings to load the Build dialog. Confirm that the Platform is set to Android. If not, choose Android and click Switch Platform. Note that Scenes/Main Scene should be loaded and checked in the Scenes In Build portion of the dialog. If not, click the Add Open Scenes button to add Main Scene to the list of scenes to be included in the build. Click the Build button. This will create an Android executable application with the .apk file extension.

Invalid command Android error

Some Android users have reported an error relating to the Android SDK Tools location. The problem has been confirmed in many installations prior to Unity 2017.1. If this problem occurs, the best solution is to downgrade to a previous version of the SDK Tools. This can be done by following these steps:

1. Locate and delete the Android SDK Tools folder [Your Android SDK Root]/tools. This location will depend on where the Android SDK package was installed. For example, on my computer the Android SDK Tools folder is found at C:\Users\cpalmer\AppData\Local\Android\sdk.
2. Download SDK Tools from http://dl-ssl.google.com/Android/repository/tools_r25.2.5-windows.zip.
3. Extract the archive to the SDK root directory.
4. Re-attempt the Build project process.

If this is the first time you are creating an Android application, you might get an error indicating that Unity cannot locate your Android SDK root directory. If this is the case, follow these steps:

1. Cancel the build process and close the Build Settings... window.
2. Choose Edit | Preferences... from the main menu.
3. Choose External Tools and scroll down to Android.
4. Enter the location of your Android SDK root folder. If you have not installed the SDK, click the download button and follow the installation process.

Install the app onto your phone and load the phone into your Cardboard device.

iOS Instructions

The process for building an iOS app is much more involved than the Android process.
There are two different types of builds:

Build for testing
Build for distribution (which requires an Apple Developer License)

In either case, you will need the following items to build a modern iOS app:

A Mac computer running OS X 10.11 or later
The latest version of Xcode
An iOS device and USB cable
An Apple ID
Your Unity project

For this demo, we will build an app for testing, and we will assume you have completed the Getting Started steps (https://docs.unity3d.com/Manual/iphone-GettingStarted.html) from Section 1. If you do not yet have an Apple ID, obtain one from the Apple ID site (http://appleid.apple.com/). Once you have obtained an Apple ID, it must be added to Xcode:

1. Open Xcode. From the menu bar at the top of the screen, choose Xcode | Preferences. This will open the Preferences window.
2. Choose Accounts at the top of the window to display information about the Apple IDs that have been added to Xcode.
3. To add an Apple ID, click the plus sign at the bottom-left corner and choose Add Apple ID.
4. Enter your Apple ID and password in the resulting popup box. Your Apple ID will then appear in the list. Select your Apple ID.
5. Apple Developer Program teams are listed under the heading of Team. If you are using the free Apple ID, you will be assigned to Personal Team. Otherwise, you will be shown the teams you are enrolled in through the Apple Developer Program.

Preparing your Unity project for iOS

1. Within Unity, open the Build Settings from the top menu (File | Build Settings).
2. Confirm that the Platform is set to iOS. If not, choose iOS and click Switch Platform at the bottom of the window.
3. Select the Build & Run button.

Building an iOS application

Xcode will launch with your Unity project. Select your platform and follow the standard process for building an application from Xcode. Install the app onto your phone and load the phone into your Cardboard device.

We looked at the basic Unity workflow for developing VR experiences.
We also provided a stationary solution so that we could focus on the development process. The Cardboard platform provides access to VR content from a mobile platform, but it also allows for touch and gaze controls.

You read an excerpt from the book Virtual Reality Blueprints, written by Charles Palmer and John Williamson. In this book, you will learn how to create compelling virtual reality experiences for mobile and desktop with three top platforms: Cardboard VR, Gear VR, and Oculus VR.

Read More
Top 7 modern Virtual Reality hardware systems
Virtual Reality for Developers: Cardboard, Gear VR, Rift, and Vive
Understanding the hype behind Magic Leap’s New Augmented Reality Headsets
Kunal Chaudhari
20 Apr 2018
4 min read
After six years of anticipation, Magic Leap, the secretive billion-dollar startup, has finally unveiled its first augmented reality headset. This mysterious new device is supposedly priced at $1,000 and hosts a variety of interesting new features. Let's take a look at why this company, notorious for working in "stealth mode", has been gaining so much popularity.

Magic Leap Origins

Magic Leap was founded in 2010 by Rony Abovitz, a tech-savvy American entrepreneur who had previously founded a company that manufactured surgical robotic arm assistance platforms. But it was not until October 2014 that the company started to make the rounds in the news, receiving $540 million of venture funding from Google, Qualcomm, Andreessen Horowitz, and Kleiner Perkins, among other leading tech investors. Some saw this funding as a desperate attempt by Google to match Facebook's acquisition of Oculus, a virtual reality startup. This exaggerated valuation was based on little more than an ambitious vision of layering digital images on top of real-world objects with spatial glasses, with no actual revenue or products to show.

The Anticipation

A year after the initial round of funding, Magic Leap released a couple of cool demos:

https://www.youtube.com/watch?v=kPMHcanq0xM Just another day in the office at Magic Leap
https://www.youtube.com/watch?v=kw0-JRa9n94 Everyday Magic with Mixed Reality

Both videos showcased augmented reality gaming and productivity applications. While the description of the first one mentioned that it was just a concept video highlighting the potential of augmented reality, the second video claimed to have been shot from the actual device without the use of any special effects. These demos skyrocketed the popularity of Magic Leap, creating huge anticipation among users, developers, and investors alike.
This hype attracted the likes of Alibaba and Disney to join hands with them in their quest for the next-generation augmented reality device.

Product Announcement and Pricing

After four years of hype videos and almost $2 billion in funding, Magic Leap finally unveiled its first product, called Magic Leap One Creator Edition. These headsets are specifically catered to developers and will start shipping later this year. The Creator Edition consists of three pieces of hardware:

Source: Magic Leap Official Website

Lightwear: The actual headset, which uses "Digital Lightfield" display technology with multiple integrated sensors to gather spatial information.
Lightpack: The core computing power of the headset lies in the Lightpack, a circular belt-worn hip pack connected to the headset.
Controller: A remote that offers buttons, six-degrees-of-freedom motion sensing, a touchpad, and haptic feedback. The remote-shaped controller appears very similar to the Samsung Gear VR and Google Daydream headset controllers.

Along with the headsets, Magic Leap also launched the Lumin SDK, the toolkit that allows developers to build AR experiences for Lumin OS, the operating system that powers the Magic Leap One headset. There's more! Magic Leap has made its SDK available in both the Unity and Unreal game engines, allowing a wide range of developers to start creating augmented reality experiences for their respective platforms.

Magic Leap hasn't shared any details on the exact pricing of the headsets, but if you go by what Rony Abovitz said in an interview, the price would be similar to that of a "premium computer". He also mentioned that the company is planning to develop high-end devices for enterprises as well as lower-end versions for the mass market.

Product trial shrouded in secrets

Magic Leap, since its inception, has claimed it will revolutionize the AR/VR space with its mysterious technology.
The company boasts that proprietary features like "Digital Lightfield" and "Visual Perception" will solve the long-standing problem of dizziness caused by continuous use of these headsets. Still, a lot of specifications are missing, such as the field of view or the processing power of the Lightpack. To add to the ambiguity, Magic Leap released a long list of security clauses for developers who want to try out its products, some almost asking the developers to "lock away the hardware". But this isn't stopping investors from pouring in more funds: Magic Leap just received another $461 million from a Saudi Arabian sovereign investment arm. The uncertainty will only be cleared when the headsets become production-ready and reach consumers. Until then, the hype remains...

To know more about other features of Magic Leap One, check out their official webpage.
Leap Motion open sources its $100 augmented reality headset, North Star
Kunal Chaudhari
13 Apr 2018
3 min read
Leap Motion, famous for building hand-tracking hardware and software, has announced its move into the augmented reality space with Project North Star, an augmented reality platform. The company is planning to open source this project, which includes a design for a headset that Leap Motion claims costs less than $100 at large-scale production.

Image Credits: Leap Motion Official Blog

Founded in 2010, Leap Motion first ventured into hand-tracking technology by announcing its own set of motion controllers. These allowed users to interact with a computer by waving their hands, fingers, or other digits around to control games, maps, or other apps. While the technology was cool at the time, it left some users unimpressed because of the sensitivity of the controllers and the lack of apps available to play with. But the company is still around, and now Leap Motion is unveiling something that could be revolutionary, or could just be another cool idea that fails to catch on. Here's a closer look:

Design

Project North Star isn't a new consumer headset; instead, the company is releasing the necessary hardware specifications, designs, and software under an open source license. The headset design uses two fast-refreshing 3.5-inch LCD displays with a resolution of 1600x1440 per eye, running at 120 fps with a 100-degree field of view. It also features Leap Motion's 180-degree hand-tracking sensor. The company claims that it offers a wider field of view than most AR systems that exist today, specifically comparing it with Microsoft HoloLens, which offers a 70-degree field of view. Most existing virtual reality and augmented reality headsets require handheld controllers for input, but with the Leap Motion sensor, users don't need to hold anything in their hands at all.

Pricing

Leap Motion doesn't plan to sell the headset, but it'll make the hardware and software open source.
They hope that someone else will build and sell the headsets, which the company says could cost less than $100 to produce. David Holz, the chief technology officer at Leap Motion, mentioned in a blog post that "Although this is an experimental platform right now, we expect that the design itself will spawn further endeavors that will become available for the rest of the world". This suggests that with relatively low-cost, open source hardware, third parties can experiment with the technology.

While all these features sound promising, there are still plenty of details yet to be revealed. A thorough comparison with other prominent AR devices like Magic Leap and HoloLens is necessary to identify Leap Motion's true potential. Till then, you can visit their official webpage to see some cool demos.

Check out other latest news:
Windows launches progressive web apps… that don't yet work on mobile
Verizon launches AR Designer, a new tool for developers
Windows launches progressive web apps... that don't yet work on mobile
Richard Gall
09 Apr 2018
2 min read
Progressive web apps are now available on the Microsoft Store. But just when you thought Microsoft was taking a step to plug the 'app gap' and catch up with the competition... this first wave of progressive web apps won't actually work on Windows mobile.

One of the central problems with the new Windows progressive web apps is that they do not have service workers implemented for Edge mobile; that means they aren't able to send push notifications. This is bad news generally for the Windows 10 mobile platform. It's possible that Microsoft might add further updates for progressive web apps on mobile, but it nevertheless signals that Microsoft just doesn't have the hunger to commit to its mobile project. As we saw just a few days ago, the company more broadly appears to be moving towards infrastructure and cloud projects. The issues around progressive web apps might well be symptomatic of this broader organizational shift.

For TechRadar, this is a misstep by Microsoft: "There's very little evidence out there that Microsoft is willing to put in the massive effort needed to get back on terms with iOS and Android devices, even in the enterprise sector, so the future doesn't look too rosy at the moment."

However, while disappointment is understandable, there's a chance that these issues will be corrected. It wouldn't actually take that much for Microsoft to fix the problem; development teams could then deploy updates to their respective applications pretty easily, without having to go through the rigmarole of submitting to the app store once again.

The list of companies that have PWAs available is, we should note, pretty impressive. It's clear that some big players in a number of different fields want to get involved:

Skyscanner
Asos
Ziprecruiter
Oyster
StudentDoctorNetwork

What this means for the future of Windows mobile isn't clear.
It certainly doesn't look great from Microsoft's perspective, and you could say this has been a bit of a missed opportunity. But all is not lost, and the company could quickly recover and use PWAs to redefine the future of its mobile offering.

Check out other latest news:
Verizon launches AR Designer, a new tool for developers
Leap Motion open sources its $100 augmented reality headset, North Star
Verizon launches AR Designer, a new tool for developers
Richard Gall
09 Apr 2018
2 min read
Verizon's AR creative studio Envrmnt has launched an app for creating augmented reality applications: AR Designer. According to Verizon, it's an easy-to-use tool that allows developers and even non-technical creatives to create augmented reality experiences for mobile apps. This might just be the thing to establish augmented reality in the mainstream.

AR Designer makes adding AR capabilities to apps easy

Ease of use is one of the key features of AR Designer. This was something T.J. Vitolo, the director of Product Management and Development at Verizon, was keen to make clear: "Today's mobile-first users expect brands, public services, and even their employers to evolve to meet their changing technology expectations for interacting with them. AR Designer enables anyone to build virtual experiences and incorporate them into their mobile application without having to hire a full development team. With AR Designer, app publishers can quickly and easily deploy a diverse set of AR experiences which can result in sales growth, a more informed public, or more effective employees."

Watch this video to find out more:
https://www.youtube.com/watch?v=thedlHq2fFM

AR Designer is debuting at the NAB Show in Las Vegas this week. It will then be available to the wider public, following early trial periods with a number of Verizon's key partners. It will be interesting to see how quickly organizations move to integrate the AR Designer SDK into their mobile applications.

It could be said that the tool represents a clear example of a common trend for software development tools to be built with ease of use in mind. The team at Verizon have identified that there's a lot of potential in lowering the bar of access to technical tools. How this impacts the way creative and development teams work together in the future will also be interesting to watch.
Check the news page for other latest news on this topic:
Windows launches progressive web apps… that don't yet work on mobile
Leap Motion open sources its $100 augmented reality headset, North Star