
Tech Guides

7 of the best machine learning conferences for the rest of 2018

Richard Gall
12 Jun 2018
8 min read
We're just about halfway through the year - scary, huh? But there's still time to attend a huge range of incredible machine learning conferences in 2018. Given that in this year's Skill Up survey developers working in every field told us that they're interested in learning machine learning, it will certainly be worth your while (and money). We fully expect this year's machine learning conference circuit to capture the attention of those beyond the analytics world.

The best machine learning conferences in 2018

But which machine learning conferences should you attend for the rest of the year? There's a lot out there, and they're not always that cheap. Let's take a look at seven of the best machine learning conferences for the rest of this year.

AI Summit London

When and where? June 12-14, 2018, Kensington Palace and ExCeL Centre, London, UK.

What is it? AI Summit is all about AI and business - it's as much for business leaders and entrepreneurs as it is for academics and data scientists. The summit covers a lot of ground, from pharmaceuticals to finance to marketing, but the main idea is to explore the incredible ways Artificial Intelligence is being applied to a huge range of problems.

Who is speaking? According to the event's website, there are more than 400 speakers at the summit. The keynote speakers include a number of impressive CEOs, including Patrick Hunger, CEO of Saxo Bank, and Helen Vaid, Global Chief Customer Officer of Pizza Hut.

Who's it for? This machine learning conference is primarily for anyone who would like to consider themselves a thought leader. Don't let that put you off though: with a huge number of speakers from across the business world, it is a great opportunity to see what the future of AI might look like.

ML Conference, Munich

When and where? June 18-20, 2018, Sheraton Munich Arabella Park Hotel, Munich, Germany.

What is it? Munich's ML Conference is also about the applications of machine learning in the business world. But it's a little more practical-minded than AI Summit - it's more about how to actually start using machine learning from a technological standpoint.

Who is speaking? Speakers at ML Conference are researchers and machine learning practitioners. Alison Lowndes from NVIDIA will be speaking, likely offering some useful insight on how NVIDIA is helping make deep learning accessible to businesses; Christian Petters, solutions architect at AWS, will also be speaking on the important area of machine learning in the cloud.

Who's it for? This is a good conference for anyone starting to become acquainted with machine learning. Obviously data practitioners will be the core audience here, but sysadmins and app developers starting to explore machine learning would also benefit from this sort of machine learning conference.

O'Reilly AI Conference, San Francisco

When and where? September 5-7, 2018, Hilton Union Square, San Francisco, CA.

What is it? According to O'Reilly's page for the event, this conference is being run to counter those conferences built around academic AI research. It's geared (surprise, surprise) towards the needs of businesses. Of course, there's a little bit of aggrandizing marketing spin there, but the idea is fundamentally a good one. It's all about exploring how cutting-edge AI research can be used by businesses. It's somewhere between the two above - practical enough to be of interest to engineers, but with enough blue-sky scope to satisfy the thought leaders.

Who is speaking? O'Reilly have some great speakers here. There's someone else making an appearance for NVIDIA - Gaurav Agarwal, who's heading up the company's automated vehicles project. There's also Sarah Bird from Facebook, who will likely have some interesting things to say about how her organization is planning to evolve its approach to AI over the years to come.

Who is it for? This is for those working at the intersection of business and technology. Data scientists and analysts grappling with strategic business questions, and CTOs and CMOs beginning to think seriously about how AI can change their organization, will all find something here.

O'Reilly Strata Data Conference, New York

When and where? September 12-13, 2018, Javits Center, New York, NY.

What is it? O'Reilly's Strata Data Conference is slightly more big data focused than its AI Conference. Yes, it will look at AI and deep learning, but it's going to tackle those areas from a big data perspective first and foremost. It's more established than the AI Summit (it actually started back in 2012 as Strata + Hadoop World), so there's a chance it will have a slightly more conservative vibe. That could be a good or bad thing, of course.

Who is speaking? This is one of the biggest big data conferences on the planet. As you'd expect, the speakers are from some of the biggest organizations in the world, from Cloudera to Google and AWS. There's a load of names we could pick out, but the one we're most excited about is Varant Zanoyan from Airbnb, who will be talking about Zipline, Airbnb's new data management platform for machine learning.

Who's it for? This is a conference for anyone serious about big data. There's going to be a considerable amount of technical detail here, so you'll probably want to be well acquainted with what's happening in the big data world.

ODSC Europe 2018, London

When and where? September 19-22, 2018, Novotel West, London, UK.

What is it? The Open Data Science Conference is very much all about the open source communities that are helping push data science, machine learning and AI forward. There's certainly a business focus, but the event is as much about collaboration and ideas. They're keen to stress how mixed the crowd is at the event. From data scientists to web developers, academics and business leaders, ODSC is all about inclusivity. It's also got a clear practical bent. Everyone will want different things from the conference, but learning is key here.

Who is speaking? ODSC haven't yet listed speakers, simply stating on their website that "our speakers include some of the core contributors to many open source tools, libraries, and languages". This indicates the direction of the event - community driven, and all about the software behind it.

Who's it for? More than any of the other machine learning conferences listed here, this is probably the one that really is for everyone. Yes, it might be more technical than theoretical, but it's designed to bring people into projects. Speakers want to get people excited, whether they're an academic, app developer or CTO.

MLConf SF, San Francisco

When and where? November 14, 2018, Hotel Nikko, San Francisco, CA.

What is it? MLConf has a lot in common with ODSC. The focus is on community and inclusivity rather than being overtly corporate. However, it is very much geared towards cutting-edge research from people working in industry and academia - this means it has a little more of a specialist angle than ODSC.

Who is speaking? At the time of writing, MLConf are on the lookout for speakers. If you're interested, submit an abstract - guidelines can be found here. However, the event does have Uber's Senior Data Science Manager Franziska Bell scheduled to speak, which is sure to be an interesting discussion on the organization's current thinking and the challenges of the huge amounts of data at its disposal.

Who's it for? This is an event for machine learning practitioners and students. Level of expertise isn't strictly an issue - an inexperienced data analyst could get a lot from this. With some key figures from the tech industry, there will certainly be something for those in leadership and managerial positions too.

AI Expo, Santa Clara

When and where? November 28-29, 2018, Santa Clara Convention Center, Santa Clara, CA.

What is it? Santa Clara's AI Expo is one of the biggest machine learning conferences. With four different streams - AI technologies, AI and the consumer, AI in the enterprise, and Data analytics for AI and IoT - the event organizers are really trying to make their coverage comprehensive.

Who is speaking? The event's website boasts 75+ speakers. The most interesting include Elena Grewal, Airbnb's Head of Data Science, Matt Carroll, who leads developer relations at Google Assistant, and LinkedIn's Senior Director of Data Science, Xin Fu.

Who is it for? With so much on offer, this has wide appeal. From marketers to data analysts, there's likely to be something for you. However, with so much going on, you do need to know what you want to get out of an event like this - so be clear on what AI means to you and what you want to learn.

Did we miss an important machine learning conference? Are you attending any of these this year? Let us know in the comments - we'd love to hear from you.

The Risk of Wearables - How Secure is Your Smartwatch?

Sam Wood
10 Jun 2016
4 min read
Research suggests we're going to see almost 700 million smartwatches and wearable units shipped to consumers over the next few years. Wearables represent an exciting new frontier for developers - and a potential new cyber security risk. Smartwatches record a surprisingly large amount of data, and that data often isn't very secure.

What data do smartwatches collect?

Smartwatches are stuffed full of sensors to monitor your body and the world around you. A typical smartwatch might include any of the following:

- Gyroscope
- Accelerometer
- Light detection
- Heart rate monitor
- GPS
- Pedometer

Through SDKs like Apple's ResearchKit, or through firmware like Fitbit's, apps can be created that allow a wearable to monitor and collect this very personal physical data. This data collection is benign and useful - but it encompasses some very personal parts of an individual's life, such as health, daily activities, and even sleeping patterns.

So is it secure? Where is the data stored and how can hackers access it?

Smart wearables almost always link up to another 'host' device, and that device is almost always a mobile phone. Data from wearables is stored and analysed on that host device, and is in turn vulnerable to the myriad of attacks that can be undertaken against mobile devices. Potential attacks include:

- Direct USB connection: physically linking your wearable with a USB port, either after theft or with a fake charging station. Think it's unlikely? So-called 'juice jacking' is more common than you might think.
- WiFi, Bluetooth and Near Field Communication: wearables are made possible by wireless networks, whether Bluetooth, WiFi, or NFC. This makes them especially vulnerable to the myriad of wireless attacks it is possible to execute - even something as simple as rooting a device over WiFi with SSH.
- Malware and web-based attacks: mobile devices remain highly vulnerable to attacks from malware and web-based exploits such as Stagefright.

Why is this data a security risk?

You might be thinking, "What do I care if some hacker knows how much I walk during the day?" But access to this data has some pretty scary implications. Our medical records are sealed tight for a reason - do you really want a hacker to be able to intuit the state of your health from your heart rate and exercise? What about if they then sell that data to your medical insurer?

Social engineering is one of the most used tools of anyone seeking access to a secure system or area. Knowing how a person slept, where they work out, when their heart rate has been elevated - even what sort of mood they might be in - all makes it that much easier for a hacker to manipulate human weakness. Even if we're not a potential gateway into a highly secured organization, this data can hypothetically be used by dodgy advertisers and products to target us when we're at our most vulnerable. For example, 'freemium' games often have highly sophisticated models for when to push their paid content and turn us into 'whales' who are always buying their product. Access to elements of our biometrics would only make this that much easier.

What does this mean?

As our lives integrate more and more with information technology, our data moves further and further outside of our own control. Wearables mean the recording of some of our most intimate details - and putting that data at risk in turn. Even when we work to keep it secure, it only takes one momentary lapse to put it at risk from anyone who's ever been interested in seeing it.
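One mitigation worth making concrete is encrypting wearable samples at rest on the host device, so that a stolen phone or a juice-jacking USB dump yields only ciphertext. The sketch below is purely illustrative and not from the original talk: the SensorSample shape and the in-memory key handling are invented for the example, and a real app would keep the key in the platform keystore (Keychain/Keystore) rather than next to the data.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Hypothetical shape for the kinds of samples listed above.
interface SensorSample {
  sensor: "heartRate" | "pedometer" | "gps";
  value: number;
  timestamp: number;
}

// Encrypt a batch of samples with AES-256-GCM before writing them to storage.
function encryptSamples(samples: SensorSample[], key: Buffer) {
  const iv = randomBytes(12); // standard GCM nonce size
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const plaintext = Buffer.from(JSON.stringify(samples), "utf8");
  const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  return { iv, ciphertext, tag: cipher.getAuthTag() };
}

function decryptSamples(
  blob: { iv: Buffer; ciphertext: Buffer; tag: Buffer },
  key: Buffer,
): SensorSample[] {
  const decipher = createDecipheriv("aes-256-gcm", key, blob.iv);
  decipher.setAuthTag(blob.tag); // tampered data fails here, not silently
  const plaintext = Buffer.concat([
    decipher.update(blob.ciphertext),
    decipher.final(),
  ]);
  return JSON.parse(plaintext.toString("utf8"));
}

// Demo with a throwaway key; never hard-code or co-locate a real key.
const key = randomBytes(32);
const blob = encryptSamples(
  [{ sensor: "heartRate", value: 72, timestamp: Date.now() }],
  key,
);
console.log(decryptSamples(blob, key)); // round-trips the original samples
```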
Information security is only going to get more vital to all of us.

Acknowledgements

This blog is based on the presentation delivered by Sam Phelps at Security BSides London 2016.

How Blockchain can level up IoT Security

Savia Lobo
29 Aug 2017
4 min read
The IoT encompasses a host of sensors, vehicles and other devices with embedded electronics that can communicate over the Internet. These IoT-enabled devices generate tons of data every second. And with IoT edge analytics, these devices are getting much smarter - they can start or stop a request without any human intervention.

"25 billion connected 'things' will be connected to the internet by 2020." - Gartner Research

With so much data being generated by these devices, the question on everyone's mind is: will all this data be reliable and secure?

When Brains meet Brawn: Blockchain for IoT

Blockchain, an open distributed ledger, is highly secure and difficult for anyone connected over the network to manipulate or corrupt. It was initially designed for cryptocurrency-based financial transactions; Bitcoin is a famous example which has blockchain as its underlying technology. Blockchain has come a long way since then and can now be used to store anything of value. So why not save data in it? That data will be secure, just like every digital asset in a blockchain is. Blockchain, decentralized and secure, is an ideal structure to form the underlying foundation for IoT data solutions.

Current IoT devices and their data rely on a client-server architecture. All devices are identified, authenticated, and connected via cloud servers, which are capable of storing ample amounts of data. But this requires huge infrastructure, which is expensive. Blockchain not only provides an economical alternative, but, since it works in a decentralized fashion, it also eliminates single points of failure, creating a more secure and robust network for IoT devices. This makes IoT more secure and reliable. Customers can therefore relax knowing their information is in safe hands. Today, blockchain's capabilities extend beyond processing financial transactions - it can now track billions of connected devices, process transactions and even coordinate between devices - a good fit for the IoT industry.

Why Blockchain is perfect for IoT

Inherently weak security features make IoT devices suspect. Blockchain, on the other hand, with its tamper-proof ledger, is hard to manipulate for malicious activities - making it the right infrastructure for IoT solutions.

Enhancing security through decentralization

Blockchain makes it hard for intruders to intervene, as it spans a network of secure blocks. A change at a single location, therefore, does not affect the other blocks. The data or any value remains encrypted and is only visible to the person who encrypted it using a private key. The cryptographic algorithms used in blockchain technology ensure that IoT data remains private, whether for an individual organization or for the organizations connected in a network.

Simplicity through autonomous third-party-free transactions

Blockchain technology is already a star in the finance sector thanks to the adoption of smart contracts, Bitcoin and other cryptocurrencies. Apart from providing a secure medium for financial transactions, it eliminates the need for third-party brokers such as banks to provide a guarantee over peer-to-peer payment services. With blockchain, IoT data can be treated in a similar manner, wherein smart contracts can be made between devices to exchange messages and data. This type of autonomy is possible because each node in the blockchain network can verify the validity of a transaction without relying on a centralized authority. Blockchain-backed IoT solutions will thus enable trustworthy message sharing. Business partners can easily access and exchange confidential information within the IoT without a centralized management or regulatory authority. This means quicker transactions, lower costs and fewer opportunities for malicious intent such as data espionage.

Blockchain's immutability for predicting IoT security vulnerabilities

Blockchains maintain a history of all transactions made by smart devices connected within a particular network. This is possible because once you enter data in a blockchain, it lives there forever in its immutable ledger. The possibilities for IoT solutions that leverage blockchain's immutability are limitless. Some obvious use cases are more robust credit scores and preventive healthcare solutions that use data accumulated through wearables. For all the above reasons, we expect significant blockchain adoption by IoT-based businesses in the near future.
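That immutability comes down to hash-linking: each block commits to its predecessor's hash, so editing one historical record invalidates every block after it. The sketch below illustrates just that property; it is a toy, and a real blockchain adds consensus, signatures and peer-to-peer replication on top, none of which appear here.

```typescript
import { createHash } from "node:crypto";

// A block links to its predecessor by hash; that link makes the ledger
// tamper-evident.
interface Block {
  index: number;
  timestamp: number;
  data: string; // e.g. an IoT sensor reading or device message
  prevHash: string;
  hash: string;
}

function hashBlock(b: Omit<Block, "hash">): string {
  return createHash("sha256")
    .update(`${b.index}|${b.timestamp}|${b.data}|${b.prevHash}`)
    .digest("hex");
}

function addBlock(chain: Block[], data: string): Block {
  const prev = chain[chain.length - 1];
  const partial = {
    index: prev ? prev.index + 1 : 0,
    timestamp: Date.now(),
    data,
    prevHash: prev ? prev.hash : "0".repeat(64),
  };
  const block = { ...partial, hash: hashBlock(partial) };
  chain.push(block);
  return block;
}

// Any validator can re-derive every hash; one edited reading breaks the chain.
function verify(chain: Block[]): boolean {
  return chain.every(
    (b, i) =>
      b.hash === hashBlock(b) && (i === 0 || b.prevHash === chain[i - 1].hash),
  );
}

const chain: Block[] = [];
addBlock(chain, "thermostat-42: 21.5C");
addBlock(chain, "lock-07: door opened");
console.log(verify(chain)); // true
chain[0].data = "thermostat-42: 99.9C"; // tamper with history
console.log(verify(chain)); // false
```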

Technical debt is damaging businesses

Richard Gall
11 Jun 2018
5 min read
A lot of things make working in tech difficult. Technical debt is one of them. Whether you're working in-house or for an external team, you've probably experienced some tough challenges when it comes to legacy software. Most people have encountered strange internal software systems, or a CMS that has been customized in a way that no one has the energy to fathom. Working your way around and through these can be a headache, to say the least.

In this year's Skill Up survey, we found that technical debt and legacy issues are seen by developers as the biggest barrier to business goals. According to 49% of respondents, old technology and software is stopping organizations from reaching their full potential. But it might also be stopping developers from moving forward in their careers. Read the report in full: sign up to our newsletter and download the PDF for free.

Technical debt and the rise of open source

Arguably, issues around technical debt have become more pronounced in the last decade as the pace of technical change has seemingly increased. I say seemingly, because it's not so much that we're living in an entirely new technical landscape. It's more that the horizons of that landscape are expanding. There are more possibilities and options open to businesses today. Technology leadership is difficult in 2018. To do it well, you need to stay on top of new technologies. But you also need a solid understanding of your internal systems and your team, as well as wider strategic initiatives and business goals. There are a lot of threads you need to manage.

Are technology leaders struggling with technical debt?

Perhaps technology leaders are struggling. But perhaps they're also making the best of difficult situations. When you're juggling multiple threads in the way I've described, you need to remain focused on what's important. Ultimately, that's delivering software that delivers value. True, your new mobile app might not be ideal; the internal CMS you were building for a client might not offer an exemplary user experience. But it still does the job - and that, surely, is the most important thing?

We can do better - let's solve technical debt together

It's important to be realistic. In the age of burnout and overwork, let's not beat ourselves up when things aren't quite what we want. Much of software engineering is, after all, making the best of a bad situation. But the solutions to technical debt can probably be found in a cultural shift. The lack of understanding of technology on the part of management is surely a large cause of technical debt. When projects aren't properly scoped, and when deadlines are set without a clear sense of what level of work is required, that's when legacy issues begin to become a problem.

In fact, it's worth looking at all the other barriers. In many ways, they are each a piece of the puzzle if we are to use technology more effectively - more imaginatively - to solve business problems. Take these three:

- Lack of quality training or learning
- Team resources
- Lack of investment in projects

All of these point to a wider cultural problem with the way software is viewed in businesses. There's no investment, teams are under-resourced, and support to learn and develop new skills is simply not being provided. With this lack of regard for software, it's unsurprising that developers are spending more time solving problems on, say, legacy code than solving big, interesting problems - ones that might actually have a big impact.

One way of solving technical debt, then, is to make a concerted effort to change the cultural mindset. Yes, some of this will need to come from senior management, but all software engineers need to take responsibility. This means better communication and collaboration, and a commitment to documentation - those things that are so easy to forget to do well when you could be shipping code.

What happens if we don't start solving technical debt?

Technical debt is like global warming - it's happening already. We feel the effects every day. However, it's only going to get worse. Yes, it's going to damage businesses, but it's also going to hurt developers. It's restricting the scope of developers to do the work they want to do and make a significant impact on their businesses. It seems as though we're locked in a strange cycle where businesses talk about the importance of 'digital skills' and technical knowledge gaps but, ironically, can't offer the resources or scope for talented developers to actually do their job properly. Developers bring skills, ideas, and creativity to jobs only to find that there isn't really time to indulge that creativity. "Maybe next year, when we have more time" goes the common refrain. There's never going to be more time - that's obvious to anyone who's ever had a job, engineer or otherwise. So why not take steps to start solving technical debt now?

Read next:

- 8 Reasons why architects love API driven architecture
- Python, Tensorflow, Excel and more - Data professionals reveal their top tools
- The best backend tools in web development

Progressive Web AMPs: Combining Progressive Web Apps and AMP

Sugandha Lahoti
14 Jun 2018
8 min read
Modern-day web development is getting harder. Users are looking for relentless, responsive and reliable browsing. They want faster results and richer experiences. In addition, modern apps need to be designed to support a large number of ecosystems: mobile web, desktop web, native iOS, native Android, Instant Articles, etc. Every new technology that launches has its own USP. The need today is to combine the features of the various popular mobile technologies on the market and reap their benefits as a combination.

Acknowledging the standalones

In a study by Google, it was found that "53% of mobile site visits are abandoned if pages take longer than 3 seconds to load." This calls for making page loads faster and effortless. A cure for this illness comes in the form of AMP, or Accelerated Mobile Pages, the brainchild of Google and Twitter. They are blazingly fast web pages meant purely for readability and speed. Essentially they are HTML and most of CSS, but no JavaScript. So heavy-duty things such as images are not loaded until they are scrolled into view. In AMPs, links are pre-rendered before you click on them. This is made possible using the AMP caching infrastructure, which automatically caches and calls on the content to be displayed atop the AMP - and that is why it feels instant. Because developers almost never write JavaScript, it leads to a cheap, yet fairly interactive, deployment model. However, AMPs are useful for a narrow range of content and have limited functionality.

Users, on the other hand, are also looking for reliability and engagement. This called for the development of what are known as Progressive Web Apps. Proposed by Google in 2015, PWAs combine the best of mobile and web applications to offer users an enriching experience. Think of a Progressive Web App as a website that acts and feels like a complete app. Once the user starts exploring the app within the browser, it progressively becomes smarter and faster, and makes the user experience richer. Application shell architecture and service workers are the two core drivers that enable a PWA to offer speed and functionality. Key benefits that PWAs offer over traditional mobile sites include push notifications, a highly responsive UI, all types of hardware access (including access to camera and microphone), and low data usage, to name a few.

The concoction: PWA + AMP

AMPs are fast and easy to deploy. PWAs are engaging and reliable. AMPs are effortless, more retentive and instant. PWAs support dynamic content, and provide push notifications and web manifests. AMPs work on user acquisition. PWAs enhance user experiences. They seemingly work perfectly well on different levels. But users want to start quick and stay quick. They want the first hop to the content to be blazingly fast, and then to be given richer pages with amazing reliability and engagement. This called for combining the features of both into one, and this is how Progressive Web AMPs were born. PWAMP, as developers call it, combines the capabilities of the native app ecosystem with the reach of the mobile web. Let us look at how exactly it functions.

The best of both worlds: reaping the benefits of both

AMPs fall short when you have dynamic content: the lack of JavaScript means dynamic functionality such as payments or push notifications is unavailable. A PWA, on the other hand, can never be as fast as an AMP on the first click. Progressive Web AMPs combine the best features of both by making the first click super fast and then rendering subsequent PWA pages/content. So AMP opens a webpage in the blink of an eye with zero time lag, and then the subsequent swift transition to the PWA leads to beautiful results with dynamic functionality. It starts fast and builds up as users browse further. This merger is made possible in three different ways.

AMP as PWA: AMP pages in combination with PWA features

This involves enabling PWA features in AMP pages. The user clicks on a link, it boots up fast, and you see an AMP page which loads from the AMP cache. On clicking subsequent links, the user moves away from the AMP cache to the site's domain (origin). The website continues using the AMP library, but because it is now served from the origin, service workers become active, making it possible to prompt users (via web manifests) to install a PWA version of the website for a progressive experience.

AMP to PWA: AMP pages utilized for a smooth transition to PWA features

In PWAs, the service workers and app shell kick in only after the second step. Hence AMPs can be a perfect entry point for your apps: while the user discovers content at fast rates with AMP pages, the service worker of the PWA installs in the background, and the user is instantly upgraded to the PWA on subsequent clicks, which can add push notifications, reminders, web manifests, etc. (a sketch of such a service worker appears at the end of this article). So, basically, the next click is also going to be instant.

AMP in PWA: AMP as a data source for PWA

AMPs are easy and safe to embed. As they are self-contained units, they are easily embeddable in websites. Hence they can be utilized as a data source for PWAs. This uses Shadow AMP, an AMP library that can be introduced in your PWA and loads in the top-level page. It can amplify the portions of the content the developer chooses and connect to a whole load of documents for rendering. As the AMP library is compiled and loaded only once for the entire PWA, it reduces backend implementation work and client complexity.

How are they used in real-world scenarios?

Shopping: PWAMP offers a high-engagement experience to shoppers. Because AMP sites are automatically kept at the top by Google search, AMP attracts customers to your site through faster discovery. The PWA keeps them there by allowing a rich, immersive, and app-like shopping experience that keeps shoppers engaged. Lancôme, the L'Oréal Paris cosmetics brand, is soon combining AMP with its existing PWA. Its PWA had led to a 17% year-over-year increase in mobile sales; with the addition of AMP, it aims to build lightweight mobile pages that load as fast as possible on smartphones to make the site faster and more engaging.

Travel: PWAMP features allow users to browse through a list of hotels which instantly loads at the first click. The customer may then book a hotel of their choice in a subsequent click, which upgrades them to the PWA experience. Wego is a Singapore-based travel service. Its PWAMP has achieved a load time of 1.6 seconds for new users and 1 second for returning customers. Since its launch, this has helped increase site visits by 26%, reduce bounce rates by 20% and increase conversions by 95%.

News and media: Progressive Web AMPs are also highly useful in news apps. As the user engages with content using AMP, the PWA downloads in the background, creating frictionless, uninterrupted reading. The Washington Post has come up with one such app, where users experience the Progressive Web App when reading an AMP article and clicking through to the PWA link when it appears in the menu. In addition, its PWA icon can be added to a user's home screen through the phone's browser.

All the above examples showcase how the concoction proves to always be fast, no matter what. Progressive Web AMPs are progressively enhanced with just one backend - the AMP to rule them all - meaning that deploy targets are reduced considerably. All ecosystems, namely web, Android, and iOS, are supported with just thin layers of extra code, making PWAMPs highly beneficial where engineering resources are constrained or infrastructure complexity must be kept down. In addition, Progressive Web AMPs are highly useful when a site has a lot of static content on individual pages, such as travel, media and news sites. All of this asserts that PWAMP has the power to provide a full mobile web experience with an artful and strategic combination of the AMP and PWA technologies. To know more about how to build your own Progressive Web AMPs, you can visit the official developer's website.

Read next:

- Top frameworks for building your Progressive Web Apps (PWA)
- 5 reasons why your next app should be a PWA (progressive web app)
- Build powerful progressive web apps with Firebase
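As referenced in the "AMP to PWA" section, the handover relies on the AMP page installing the site's service worker in the background (AMP provides the amp-install-serviceworker component for this), so the shell is already cached by the time the user clicks through. Below is a minimal, illustrative sketch of the worker side in TypeScript; the cache name and asset paths are invented for the example.

```typescript
// sw.ts - compiled to /sw.js and referenced from the AMP page via
// <amp-install-serviceworker src="/sw.js">, so it installs while the
// reader is still on the cached AMP document.
declare const self: ServiceWorkerGlobalScope; // requires the "webworker" lib

const SHELL_CACHE = "pwa-shell-v1";
const SHELL_ASSETS = ["/", "/app.js", "/styles.css", "/offline.html"]; // illustrative

self.addEventListener("install", (event: ExtendableEvent) => {
  // Precache the PWA shell so the user's *next* click is served locally.
  event.waitUntil(
    caches.open(SHELL_CACHE).then((cache) => cache.addAll(SHELL_ASSETS)),
  );
});

self.addEventListener("fetch", (event: FetchEvent) => {
  // Cache-first for anything precached; fall back to the network, and to
  // a simple offline page when the network is unavailable.
  event.respondWith(
    caches.match(event.request).then(async (cached) => {
      if (cached) return cached;
      try {
        return await fetch(event.request);
      } catch {
        return (await caches.match("/offline.html"))!;
      }
    }),
  );
});
```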

My friend, the robot: Artificial Intelligence needs Emotional Intelligence

Aaron Lazar
21 Feb 2018
8 min read
Tommy's a brilliant young man who loves programming. He's so occupied with computers that he hardly has any time for friends. Tommy programs a very intelligent robot called Polly, using Artificial Intelligence, so that he has someone to talk to. One day, Tommy gets hurt real bad about something and needs someone to talk to. He rushes home to talk to Polly and pours out his emotions to her. To his disappointment, Polly starts giving him advice like she does for any other thing. She doesn't understand that he needs someone to "feel" what he's feeling rather than rant away on what he should or shouldn't be doing. He naturally feels disconnected from Polly.

My friend doesn't get me

Have you ever wondered what it would be like to have a robot as a friend? I'm thinking something along the lines of Siri. Siri's pretty good at holding conversations and is quick-witted too. But Siri can't understand your feelings or emotions, and neither can "she" feel anything herself. Are we missing that "personality" in the artificial beings we're creating? Even with chatbots, although we gain through convenience, we lose the emotional aspect, especially at a time when expressive communication is the most important.

Do we really need it?

I remember watching The Terminator, where Arnie asks John, "Why do you cry?" John finds it difficult to explain to him why humans cry. The fact is, though, that the machine actually understood there was something wrong with the human, thanks to the visual effects associated with crying. We've also seen some instances of robots or AI analysing sentiment through text processing. But how accurate is this? How would a machine know when a human is actually using sarcasm? What if John was faking it and could cry at the drop of a hat, or he just happened to be chopping onions? That's food for thought now.

On the contrary, you might wonder: do we really want our machines to start analysing our emotions? What if they take advantage of our emotional state? Well, that's a bit of a far-fetched thought, and what we need to understand is that it's necessary for robots to gauge a bit of our emotions to enhance the experience of interacting with them. There are several wonderful applications for such a technology. For instance, marketing organisations could use applications that detect users' facial expressions when they look at a new commercial to gauge their "interest". It could also be used by law enforcement as a replacement for the polygraph. Another interesting use case would be to help autism-affected individuals understand the emotions of others better. The combination of AI and EI could find a tonne of applications, right from cars that can sense if the driver is tired or sleepy and prevent an accident by pulling over, to a fridge that can detect if you're stressed and lock itself to prevent you from binge eating!

Recent developments in Emotional Intelligence

There have been several developments over the past few years in building systems that understand emotions. Pepper, a Japanese robot, for instance, can tell feelings such as joy, sadness and anger, and respond by playing you a song. A couple of years ago, Microsoft released a tool, the Emotion API, that could break down a person's emotions based only on their picture. Physiologists, neurologists and psychologists have collaborated with engineers to find measurable indicators of human emotion that can be taught to computers to look out for. There are projects that have attempted to decode facial expressions, the pitch of our voices, biometric data such as heart rate, and even our body language and muscle movements. Bronwyn van der Merwe, General Manager of Fjord in the Asia Pacific region, revealed that big companies like Amazon, Google and Microsoft are hiring comedians and scriptwriters in order to harness the human-like aspect of AI by inducing personality into their technologies. Jerry, Ellen, Chris, Russell... are you all listening?

How it works

Almost 40% of our emotions are conveyed through tone of voice; the rest is read through the facial expressions and gestures we make. An enormous amount of data is collected from media content and other sources and used as training data for algorithms to learn human facial expressions and speech.

One type of learning used is active learning, or human-assisted machine learning. This is a kind of supervised learning where the learning algorithm is able to interactively query the user to obtain new data points or an output. Situations might exist where unlabeled data is plentiful but manually labeling the data is expensive. In such a scenario, learning algorithms can query the user for labels. Since the algorithm chooses the examples, the number of examples needed to learn a concept turns out to be lower than what is required for usual supervised learning (a toy sketch of this loop appears at the end of this article).

Another approach is transfer learning, a method that focuses on storing the knowledge gained while solving one problem and then applying it to a different but related problem. For example, knowledge gained while learning to recognize fruits could apply when trying to recognize vegetables. This works by analysing a video for facial expressions and then transferring that learning to label the speech modality.

What's under the hood of these machines?

Powerful robots that are capable of understanding emotions would most certainly be running neural nets under the hood, complemented by beefy CPUs and GPUs along the lines of the Nvidia Titan X GPU and Intel's Nervana chip. Last year at NIPS, amongst controversial body shots and loads of humour-filled interactions, Kory Mathewson and Piotr Mirowski entertained audiences with A.L.Ex and Pyggy, two AI robots that have played alongside humans in over 30 shows. These robots introduce audiences to the "comedy of speech recognition errors" by blabbering away to each other as well as to humans. Built around a Recurrent Neural Network trained on dialogue from thousands of films, A.L.Ex communicates with human performers, audience participants, and spectators through speech recognition, voice synthesis, and video projection. A.L.Ex is written in Torch and Lua, has a vocabulary of 50,000 words extracted from 102,916 movies, and is built on an RNN with a Long Short-Term Memory (LSTM) architecture and 512-dimensional layers.

The unconquered challenges today

The way I see it, there are broadly three challenge areas that AI-powered robots face today:

- Rationality and emotions: AI robots need to be fed with initial logic by humans, failing which they cannot learn on their own. They may never have the level of rationality or the breadth of emotions to take decisions the way humans do.
- Intuition, strategic thinking and emotions: machines are incapable of thinking into the future and taking decisions the way humans can. For example, not very far into the future, we might have an AI-powered dating application that measures a subscriber's interest level while chatting with someone. It might just rate the interest level lower if the person is in a bad mood due to some other reason. It wouldn't consider the reason behind the emotion and whether it was actually linked to the ongoing conversation.
- Spontaneity, empathy and emotions: it may be years before robots are capable of coming up with a plan B the way humans do. Having a contingency plan and implementing it in an emotional crisis is something that AI fails at accomplishing. For example, if you're angry at something and just want to be left alone, your companion robot might just follow what you say without understanding your underlying emotion, while an actual human would instantly empathise with your situation and try to be there for you.

Bronwyn van der Merwe said, "As human beings, we have contextual understanding and we have empathy, and right now there isn't a lot of that built into AI. We do believe that in the future, the companies that are going to succeed will be those that can build into their technology that kind of an understanding".

What's in store for the future

If you ask me, right now we're on the highway to something really great. Yes, there are several aspects that are unclear about AI and robots making our lives easier vs disrupting them, but as time passes, science is fitting the pieces of the puzzle together to bring about positive changes in our lives. AI is improving on the emotional front as I write, although there are clearly miles to go. Companies like Affectiva are pioneering emotion recognition technology and are working hard to improve the way AI understands human emotions. Biggies like Microsoft had been working on bringing emotional intelligence into their AI since before 2015 and have come a long way since then. Perhaps, in the next Terminator movie, Arnie might just comfort a weeping Sarah Connor, saying, "Don't cry, Sarah dear, he's not worth it", or something of the sort. As a parting note, and just for funsies, here's a final question for you: "Can you imagine a point in the future when robots have such high levels of EQ that some of us might consider choosing them as a partner over humans?"
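As promised in the "How it works" section, here is a toy sketch of the active-learning loop: train on a small labeled seed set, then repeatedly ask a human oracle to label only the pool example the model is least certain about (uncertainty sampling). Everything here is illustrative - the nearest-centroid "model" is a deliberately simple stand-in for a real emotion classifier.

```typescript
// Toy pool-based active learning with uncertainty sampling.
interface Example {
  features: number[]; // e.g. extracted facial/voice features
  label?: 0 | 1;      // e.g. 0 = calm, 1 = distressed (illustrative)
}

interface CentroidModel { c0: number[]; c1: number[] }

// Stand-in classifier: distance to each class centroid -> P(label = 1).
function predictProba(model: CentroidModel, x: number[]): number {
  const dist = (c: number[]) => Math.hypot(...x.map((v, i) => v - c[i]));
  const d0 = dist(model.c0);
  const d1 = dist(model.c1);
  return d0 / (d0 + d1 + 1e-9);
}

function train(labeled: Example[]): CentroidModel {
  const mean = (xs: Example[]) =>
    xs[0].features.map(
      (_, i) => xs.reduce((s, e) => s + e.features[i], 0) / xs.length,
    );
  return {
    c0: mean(labeled.filter((e) => e.label === 0)),
    c1: mean(labeled.filter((e) => e.label === 1)),
  };
}

// Each round: retrain, find the pool example closest to P = 0.5 (most
// uncertain), and ask the human oracle for just that one label.
// The seed set must contain at least one example of each class.
function activeLearn(
  pool: Example[],
  seed: Example[],
  oracle: (e: Example) => 0 | 1,
  budget: number,
): CentroidModel {
  const labeled = [...seed];
  for (let round = 0; round < budget && pool.length > 0; round++) {
    const model = train(labeled);
    pool.sort(
      (a, b) =>
        Math.abs(predictProba(model, a.features) - 0.5) -
        Math.abs(predictProba(model, b.features) - 0.5),
    );
    const query = pool.shift()!; // most uncertain example
    query.label = oracle(query); // the expensive human step
    labeled.push(query);
  }
  return train(labeled);
}
```

Because the algorithm picks which examples get labeled, it can reach a given accuracy with far fewer human labels than labeling the pool at random - which is exactly the appeal the article describes.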

The trouble with Smart Contracts

Guest Contributor
03 Jul 2018
6 min read
The government of Tennessee now officially recognizes smart contracts. That's great news if we speak in terms of the publicity blockchain will receive. By virtue of such events, blockchain technology and everything related to it draws closer to becoming a standard way of how things work. However, practice shows that the deeper you delve into the nuances of blockchain, the more you understand that we are at the very beginning of quite a long and, so far, uncertain path. Before we investigate smart contracts on the back of the Tennessee law, let's look at the concept in lay terms.

Traditional contract vs smart contract

A traditional contract is simply a notarized piece of paper that details actions that are to be performed under certain conditions. It doesn't control the fulfillment of those actions; it only attests to them. A smart contract, like a paper contract, specifies conditions. But since a smart contract is basically program code, it can also carry out actions - which is impossible with a paper one. Most typically, smart contracts are executed in a decentralized environment, where:

- Anyone can become a validator and verify the authenticity of correct smart contract execution and the state of the database.
- Distributed and independent validators hugely minimize third-party reliance and give confidence concerning the unchangeability of what is to be done. That's why, before putting a smart contract into action, you should carefully check it for bugs - you won't be able to make changes once it's launched.
- All assets should be digitized, and all the data that may serve as a trigger for smart contract execution must be located within one database (system).

What are oracles?

There's a popular myth that smart contracts in Ethereum can take external data from the web and use it in their environment (for example, a smart contract transfers money to someone who won a bet on a football match's results). You cannot do that, because a smart contract relies only on the data that's on the Ethereum blockchain. Still, there is a workaround. The database (Ethereum's, in our case) can contain so-called oracles - 'trusted' parties that collect data from the 'exterior world' and deliver it to smart contracts. For more precision, it is necessary to choose a wide range of independent oracles that provide the smart contract with information; this way, you minimize the risk of their collusion (a toy simulation of this appears at the end of this article).

A smart contract itself is only a piece of code

For a better understanding, take a look at what Pavel Kravchenko, founder of Distributed Lab, has written about smart contracts in his Medium post: "A smart contract itself is a piece of code. The result of this code should be the agreement of all participants of the system regarding account balances (mutual settlements). From here indirectly it follows that a smart contract cannot manage money that hasn't been digitized. Without a payment system that provides such opportunity (for example, Bitcoin, Ethereum or central bank currency), smart contracts are absolutely helpless!"

Smart contracts under the Tennessee law

Storing data on the blockchain is now a legit thing to do in Tennessee. Here are some of the primary conditions stipulated by the law:

- Records or contracts secured through the blockchain are acknowledged as electronic records.
- Ownership rights to certain information stored on a blockchain must be protected.
- A smart contract is considered an event-driven computer program that's executed on an electronic, distributed, decentralized, shared, and replicated ledger and used to automate transactions.
- Electronic signatures and contracts secured through blockchain technologies now have equal legal standing with traditional types of contracts and signatures.

It is worth noting that the definition of a smart contract is pretty clear and comprehensive here. Unfortunately, it doesn't let the matter rest, and there are some questions that were not covered:

- How can smart contracts and traditional ones have equal legal standing if the functionality of a smart contract is much broader? Namely, it performs actions, while a traditional contract only attests to them.
- How will asset digitization be carried out?
- Are there any requirements for the smart contract source code, or some normative audit to be performed in order to minimize the risk of bugs?

The problem is not with smart contracts, but with creating the ecosystem around them. Unfortunately, it is impossible to build uniform smart-contract-based relationships in our society simply because the regulator has officially recognized the technology. For example, you won't be able to sell your apartment via smart contract functionality unless there is a regulatory base that considers:

- The specific blockchain platform whose smart contract functionality is good enough to sustain broad use.
- The way assets are digitized. And it's not only digital money transactions you would use smart contracts for: you can use smart contracts to store any valuable information, for example, the proprietary rights to your apartment.
- Who can be the authorized party/oracle that collects the exterior data and delivers it to the smart contract. (Speaking of apartments, it is basically the notary, who should verify such parameters as ownership of the apartment, its state, even your existence, etc.)

So, it's true: a smart contract itself is a piece of code and objectively is not a problem at all. What is a problem, however, is preparing a sound basis for the successful implementation of smart contracts in our everyday life - creating and launching a mechanism that would allow the connection of two entirely different gear wheels:

- smart contracts in their digital, decentralized and trustless environment;
- the real world, where we mostly deal with a top-down approach and have regulators, lawyers, courts, etc.

Read next:

- FAE (Fast Adaptation Engine): iOlite's tool to write Smart Contracts using machine translation
- Blockchain can solve tech's trust issues - Imran Bashir
- A brief history of Blockchain

About the expert, Dr. Pavel Kravchenko

Dr. Pavel Kravchenko is the founder of Distributed Lab, a blogger, cryptographer and Ph.D. in Information Security. Pavel has been working in the blockchain industry since early 2014 (Stellar). His expertise is mostly focused on cryptography, security and technological risks, and tokenization.

About Distributed Lab

Distributed Lab is a blockchain expertise center, with a core mission to develop cutting-edge enterprise tokenization solutions, laying the groundwork for the coming "Financial Internet". Distributed Lab organizes dozens of events every year for the crypto community, ranging from intensive small-format meetups and hackathons to large-scale international conferences which draw 1000+ attendees.
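As promised in the oracles section above, here is a toy, in-memory simulation of the football-bet example - deliberately plain TypeScript rather than Ethereum/Solidity code, with every name and rule invented for illustration. It shows the two properties discussed earlier: the contract acts only on data that registered oracles deliver to it, and polling several independent oracles with a majority rule reduces the risk of collusion.

```typescript
type Outcome = "HOME" | "AWAY";

interface Stake { amount: number; pick: Outcome }

class BetContract {
  private reports = new Map<string, Outcome>(); // oracle id -> reported result
  private settled = false;

  constructor(
    private readonly oracles: string[],          // independent data feeds
    private readonly stakes: Map<string, Stake>, // bettor id -> stake
  ) {}

  // Only registered oracles may deliver 'exterior world' data to the contract.
  reportResult(oracleId: string, outcome: Outcome): void {
    if (!this.oracles.includes(oracleId)) throw new Error("unknown oracle");
    this.reports.set(oracleId, outcome);
    this.trySettle();
  }

  // The contract settles itself once a majority of oracles agree on a result;
  // no bank or broker is involved.
  private trySettle(): void {
    if (this.settled) return;
    for (const outcome of ["HOME", "AWAY"] as const) {
      const votes = [...this.reports.values()].filter((v) => v === outcome).length;
      if (votes > this.oracles.length / 2) {
        this.settled = true;
        const pot = [...this.stakes.values()].reduce((s, b) => s + b.amount, 0);
        const winners = [...this.stakes].filter(([, b]) => b.pick === outcome);
        const winStake = winners.reduce((s, [, b]) => s + b.amount, 0);
        for (const [bettor, b] of winners) {
          // Payout is proportional to the winner's share of the winning stake.
          console.log(`${bettor} receives ${((pot * b.amount) / winStake).toFixed(2)}`);
        }
        return;
      }
    }
  }
}

// Three independent oracles; settlement happens on the second agreeing report.
const contract = new BetContract(
  ["oracle-a", "oracle-b", "oracle-c"],
  new Map<string, Stake>([
    ["alice", { amount: 10, pick: "HOME" }],
    ["bob", { amount: 10, pick: "AWAY" }],
  ]),
);
contract.reportResult("oracle-a", "HOME");
contract.reportResult("oracle-b", "HOME"); // majority reached -> "alice receives 20.00"
```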

Align your product experience strategy with business needs

Packt Editorial Staff
02 May 2018
10 min read
Build a product experience strategy around the needs of stakeholders Product experience strategists need to conduct thorough research to ensure that the products being developed and launched align with the goals and needs of the business. Alignment is a bit of a buzzword that you're likely to see in HBR and other publications, but don't dismiss it - it isn't a trivial thing, and it certainly isn't an abstract thing. One of the pitfalls of product experience strategy - and product management more generally - is that understanding the needs of the business isn't actually that straightforward. There's lots of moving parts, lots of stakeholders. And while everyone should be on the same page, even subtle differences can make life difficult. This is why product experience strategists do detailed internal research. It: Helps designers to understand the company's vision and objectives for the product. It allows them to understand what's at stake. Based on this, they work with stakeholders to align product objectives and reach a shared understanding on the goals of design. Once organizational alignment is achieved, the strategist uses research insights to develop a product experience strategy. The research is simply a way of validating and supporting that strategy. The included research activities are: Stakeholder and subject-matter expert (SME) interviews Documents review Competitive research Expert product reviews   Talk to key stakeholders Stakeholders are typically senior executives who have a direct responsibility for, or influence on, the product. Stakeholders include product managers, who manage the planning and day-to-day activities associated with their product, and have a direct decision-making authority over its development. In projects that are important to the company, it is not uncommon for the executive leadership from the chief executive and down to be among the stakeholders due to their influence and authority to the direct overall product strategy. The purpose of stakeholder interviews is to gather and understand the perspective of each individual stakeholder and align the perspectives of all stakeholders around a unified vision around the scope, purpose, outcomes, opportunities and obstacles involved in undertaking a new product development project. Gaps among stakeholders on fundamental project objectives and priorities, will lead to serious trouble down the road. It is best to surfaces such deviations as early as possible, and help stakeholders reach a productive alignment. The purpose of subject-matter experts (SMEs) interviews is to balance the strategic high- level thinking provided by stakeholders, with detailed insights of experienced employees who are recognized for their deep domain expertise. Sales, customer service, and technical support employees have a wealth of operational knowledge of products and customers, which makes them invaluable when analyzing current processes and challenges. Prior to the interviews, the experience strategist prepares an interview guide. The purpose of the guide is to ensure the following: All stakeholders can respond to the same questions All research topics are covered if interviews are conducted by different interviewers Interviews make the best use of stakeholders' valuable time Some of the questions in the guide are general and directed at all participants, others are more specific and focus on the stakeholders specific areas of responsibility. Similar guides are developed for SME interviews. 
In-person interviews are the best, because they take place at the onset of the project and provide a good opportunity to build rapport and trust between the designer and interviewee. After a formal introduction regarding the purpose of the interview and general questions regarding the person's role and professional experience, the person is asked for their personal assessment and opinions on various topics. Here is a sample of different topics: Objectives and obstacles Prioritized goals for the project What does success look like What kind of obstacles the project is facing, and suggestions to overcome them Competition Who are your top competitors Strength and weaknesses relative to the competition Product features and functionality Which features are missing Differentiating features Features to avoid The interviews are designed to last no more than an hour and are documented with notes and audio recordings, if possible. The answers are compiled and analyzed and the result is presented in a report. The report suggests a unified list of prioritized objectives, and highlights gaps and other risks that have been reported. The report is one of the inputs into the development of the overall product experience strategy. Experts understand product experience better than anyone Product expert reviews, sometimes referred to as heuristic evaluations, are professional assessments of a current product, which are performed by design experts for the purpose of identifying usability and user experience issues. The thinking behind the expert review technique is very practical. Experience designers have the expertise to assess the experience quality of a product in a systematic way, using a set of accepted heuristics. A heuristic is a rule of thumb for assessing products. For example, the error prevention heuristic deals with how well the evaluated product prevents the user from making errors. The word heuristic often raises questions about its meaning, and the method has been criticized for its inherent weaknesses due to the following: Subjectivity of the evaluator Expertise and domain knowledge of the evaluator Cultural and demographic background of the evaluator These weaknesses increase the probability that the outcome of an expert evaluation will reflect the biases and preferences of the evaluator, resulting in potentially different conclusions about the same product. Still, expert evaluations, especially if conducted by two evaluators, and their aligned findings, have proven to be an effective tool for experience practitioners who need a fast and cost-effective assessment of a product, particularly digital interfaces. Jacob Nielsen developed the method in the early 1990s. Although there are other sets of heuristics, Nielsen's are probably the most known and commonly used. His initial set of heuristics was first published in his book, Usability Engineering, and is brought here verbatim, as there is no need for modification: Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time. Match between system and the real world: The system should speak the user's language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order. 
User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo. Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions. Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action. Recognition rather than recall: Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate. Flexibility and efficiency of use: Accelerators--unseen by the novice user--may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions. Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility. Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution. Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large. Every product experience strategy needs solid competitor research Most companies operate in a competitive marketplace, and having a deep understanding of the competition is critical to the success and survival. Here are few of the questions that a competitive research helps addresses: How does a product or service compare to the competition? What are the strength and weaknesses of competing offerings? What alternatives and choices does the target audience have? Experience strategists use several methods to collect and analyze competitive information. From interviews with stakeholder and SMEs, they know who the direct competition is. In some product categories, such as automobiles and consumer products, companies can reverse-engineer competitive products and try to match or surpass their capabilities. Additionally, designers can develop extensive experience analysis of such competitive products, because they can have a first-hand experience with it. With some hi-tech products, however, some capabilities are cocooned within proprietary software or secret production processes. In these cases, designers can glean the capabilities from an indirect evidence of use. The Internet is a main source of competitive information, from the ability to have a direct access to a product online, to reading help manuals, user guides, bulletin boards, reviews, and analysis in trade publications. Occasionally, unauthorized photos or documents are leaked to the public domain, and they provide clues, sometimes real and sometimes bogus, about a secret upcoming product. 
Social media, too, is an important source of competitive data, in the form of customer reviews on Yelp, Amazon, or Facebook. With the wealth of this information, a practical strategy for surpassing the competition and delivering a better experience can be developed. For example, Uber has been a favorite car-hailing service for a while. The service has also generated public controversy and dissatisfied riders and drivers who are unhappy with its policies, including its resistance to tips. By design, a tipping function is not available in the app, which is the primary transaction method between the rider, the company, and the driver. Research indicates, however, that tipping for service is a common social norm and that most people tip because it makes them feel better. Not being able to tip places riders in an uncomfortable social setting and stirs negative emotions against Uber. The evidence of dissatisfaction can easily be collected from numerous web sources and from interviews with actual riders and drivers. For Uber competitors such as Lyft and Curb, making tipping an integrated part of their apps provides an immediate competitive edge that improves the experience of both riders, who have the option to reward the driver for good service, and drivers, who benefit from increased income. This, along with additional improvements over the inferior Uber experience, becomes part of an overall experience strategy focused on increasing the likelihood that riders and drivers will dump Uber in their favor.

Note: You read an extract from the book Exploring Experience Design, written by Ezra Schwartz. This book will help you unify Customer Experience, User Experience and more to shape lasting customer engagement in a world of rapid change.

Customer Relationship management just got better with Artificial Intelligence

Savia Lobo
28 Jan 2018
8 min read
According to an International Data Corporation (IDC) report, Artificial Intelligence (AI) has the potential to impact many areas of customer relationship management (CRM). AI will take the mundane tasks off CRM teams' hands, which means they can address more customer queries through an automated approach. An AI-based CRM offers highly predictive and intuitive solutions to customer problems, grabbing maximum customer attention.

With AI, CRM platforms across departments such as sales, finance, and marketing no longer limit themselves to collecting service feedback from their customers. They can also draw on the data customers generate online, on social media or through IoT devices. With such massive amounts of data arriving from various channels, it becomes tricky for organizations to keep track of their customers, and harder still to extract detailed insights from all of it. This is the gap where organizations need an AI-based, optimized approach to their CRM platform. An AI-enabled platform can help CRM teams gain insights from the large aggregation of customer data, while also paving the way for seamless customer interactions. Organizations can not only provide customers with helpful suggestions, but also recommend products that boost business profitability.

AI-infused CRM platforms can take over straightforward tasks, such as gathering client feedback, that are otherwise time-consuming. This allows businesses to focus on customers who provide higher business value but might previously have been neglected. A virtual assistant can also act as a guide for employees, allowing them to tackle customer queries without assistance from senior executives.

AI techniques such as natural language processing (NLP) and predictive analytics are used within the CRM domain to produce intelligent insights that enhance human decision-making. NLP interprets incoming emails, categorizes them on the basis of intent, and automatically drafts responses by identifying the priority level. Predictive analytics helps detect the optimal time for solving customer queries, and the mode of communication that will best engage the customer. With such functionality, a smarter move towards digitizing organizational solutions can be achieved, reaping huge profits for organizations that leverage it.

How AI is transforming CRM

Businesses aim to satisfy the customers who use their services, because keeping a customer happy leads to further revenue growth. Organizations can achieve this rapidly with the help of AI. Salesforce, the market leader in the CRM space, integrated an AI assistant popularly known as Einstein. Einstein makes CRM an easy-to-use platform by letting customers import their data into Salesforce, automatically providing ready-to-crunch, data-driven insights across different channels. Other organizations, such as SAP and Oracle, are implementing AI-based technologies in their CRM platforms to deliver an improved customer experience. Let's explore how AI benefits different functions within an organization.

Steering sales

With AI, the sales team can shift their focus from mundane administrative tasks and get to know their customers better. Sales CRM teams leverage novel scoring techniques that help prioritize quality leads, generating maximum revenue for the organization.
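As a concrete (and purely illustrative) sketch of the lead-scoring idea, a minimal model can be built with scikit-learn. Everything below - the feature names, data, and thresholds - is invented for demonstration and does not reflect any vendor's actual product:

```python
# Hedged sketch of predictive lead scoring; synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical engagement features per lead:
# [email opens, site visits, days since last contact]
X = rng.integers(0, 30, size=(500, 3)).astype(float)
# Hypothetical label: 1 = lead converted, 0 = did not convert
y = (X[:, 0] + X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 5, 500) > 20).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Conversion probabilities become the lead scores used to rank follow-ups
scores = model.predict_proba(X_test)[:, 1]
print("Top 5 lead scores:", np.round(np.sort(scores)[-5:], 2))
```

In a real CRM, scores like these would be recomputed as new interactions arrive and surfaced next to each lead, so representatives always work the most promising prospects first.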
With the help of AI, sales leaders can work towards improving sales productivity. After analyzing the company's historical data and employee activities, AI-infused CRM software can present a performance report on the top sales representatives. Such a feature helps sales leaders plan what the lower-performing representatives should learn from the top performers in order to drive conversations that are likely to generate sales. People.ai, a sales management platform, utilizes AI to deliver performance analytics and personalized coaching, and to review the sales pipeline. This can help sales leaders get a complete view of the sales activities going on within their organizations.

Marketing it better

Triggering a customer sale requires extensive push-marketing strategies. With AI-enabled marketing, customers are taken on a predictive journey designed to end in a sale or a subscription; either way, the organization wins. Predictive scoring can intelligently determine the likelihood of a customer subscribing to a newsletter or making a purchase. AI can also analyze images across social media sources such as Pinterest and Facebook, and suggest visuals for an upcoming advertising campaign. And by carrying out sentiment analysis on product reviews and customer feedback, the marketing team can gauge users' sentiment about a particular brand or product. This helps brands announce discount offers when sales dip, or increase production of a product in demand. Marketo, a marketing automation platform, includes software that helps CRM platforms gain rich behavioral insights about their customers and drive business strategies.

24*7 customer support

Whenever a customer query arises within a CRM, AI anticipates the likely issues and resolves them before they grow into problems. Customer cases are classified with predictive analytics techniques and directed to the right service agent. NLP-based digital assistants known as chatbots analyze the written content of e-mails. A chatbot efficiently responds to customer e-mails and, in rare cases, escalates an e-mail to a service agent. Chatbots can even notify a customer about an early-bird offer on a product they are likely to buy, or set up meetings and schedule reminders - fitting for the era of push notifications and smart wearables. With AI in CRM, organizations can offer customers better service along with round-the-clock support. Agent.ai, an AI-based customer service platform, allows organizations to provide 24*7*365 customer support, including holidays, weekends, and non-staffed hours.

Application development: no longer just a developer's game

Building an application has become an important milestone for any organization. If an application has a seamless, user-friendly interface, it is favored by customers, and the organization gains more traction. Building an application used to be considered a developer-only job because it involved coding. With the rise of low-code and no-code platforms, however, non-coders can now develop applications with ease. CRM platforms help businesses build applications that provide insight-driven predictions and recommendations to their customers.
Salesforce assures its customers that every application built on its platform includes intelligent data modeling, tracking, and monitoring. Business users, data scientists, and other non-developers can now build applications without learning to code, creating prediction-based applications their way, without the IT hassle.

Challenges and limitations

AI implementations are becoming common, with more organizations adopting the technology at both small and large scales. Many businesses are moving towards smart customer management by infusing AI into their operations. AI undoubtedly makes work easier, but there are challenges a CRM platform can face which, if left unaddressed, may cause revenue to decline. Below are the main challenges organizations might face while setting up AI in their CRM platform:

- Outdated data: Organizations collect a huge amount of data during various business activities to derive meaningful insights about sales, customer preferences, and so on. This data is a treasure trove for the marketing team when planning strategies to attract new customers and retain existing ones. If the data is not kept up to date, however, CRM teams may find it difficult to understand the current state of their customer relationships. To avoid this, a comprehensive data cleanup project is essential for maintaining data quality.
- Partial automation: AI creates an optimized environment for the CRM through predictive analytics and natural language processing, easing the mundane work for the CRM team so they can focus on strategic outcomes. This does not mean AI is completely replacing humans. A human touch is still needed to check whether the solutions the AI offers actually benefit the customer, and to tweak the system into an ever smarter AI.
- Intricacies of language: An AI is trained on data that includes various sets of phrases and questions along with the desired outputs. If a customer's query is not phrased in a way the model recognizes, the AI cannot provide the correct solution. Customers therefore have to phrase their queries carefully, or the machine will not understand what they are asking.

Infusing AI into CRM has multiple benefits, but the three most important are predictive scoring, forecasting, and recommendations. These benefits empower CRM to outsmart its traditional counterpart by helping organizations serve their customers with state-of-the-art results. Customers appreciate it when their queries are addressed quickly, and that leaves a positive impression of the organization. Additionally, digital assistants help firms resolve customer queries promptly.

Of perfect strikes, tackles and touchdowns: how analytics is changing sports

Amey Varangaonkar
14 Nov 2017
7 min read
The rise of Big Data and analytics is drastically changing the landscape of many businesses - and the sports industry is one of them. In today's age of cut-throat competition, data-based strategies are taking the front seat in crucial decision-making, helping teams gain a decisive edge over their competition. Sports analytics is fast becoming the next big thing!

In the past, many believed that the key to conquering an opponent in any professional sport was to make the player or the team better - stronger, faster, or more intelligent. 'Analysis' then was limited to mere 'clipboard statistics' and the intuition coaches built from raw video footage of games. This is not the case anymore. From handling media contracts and merchandising to evaluating individual or team performance on matchday, analytics is changing the landscape of sports.

The explosion of data in sports

The amount and quality of information available to decision-makers within sports organizations have increased exponentially over the last two decades. Several factors contribute to this:

- Incredible innovation in sports science over the last decade
- In-depth records maintained by trainers, coaches, medical staff, nutritionists, and even sales and marketing departments
- Improved processing power and the lower cost of storage, allowing large amounts of historical data to be maintained
- The recent adoption of motion-capture technology and wearable devices, a real game-changer that lets every movement on the field be tracked and recorded

Today, teams across a variety of sports - the Boston Red Sox and Houston Astros in Major League Baseball (MLB), the San Antonio Spurs in the NBA, and clubs like Arsenal, Manchester City, and Liverpool FC in football (soccer) - are adopting analytics in different capacities.

Turning sports data into insights

Needless to say, all the crucial sports data generated today needs equally good analytics techniques to extract the most value from it. This is where sports analytics comes into the picture. Sports analytics is the use of analytics on current as well as historical sport-related data to identify useful patterns that can yield a competitive advantage on the field of play.

Several techniques and algorithms fall under the umbrella of sports analytics. Machine learning, a popular form of Artificial Intelligence in which systems are trained on large datasets to give reliable predictions on new data, is among the most widely used. With a variety of classification and recommendation algorithms, analysts can identify patterns in a player's existing attributes and determine how best to optimize them to improve performance. Cross-validation techniques then ensure the models are not biased towards one split of the data, so the predictions generalize even to unseen datasets (see the sketch below).

Analytics is being put to use by many sports teams today, in many different ways. Here are some key use cases.

Pushing the limit: Optimizing player performance

Right from tracking an athlete's heartbeats per minute to finding injury patterns, analytics can play a crucial role in understanding how an individual performs on the field.
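To make the cross-validation point concrete, here is a toy sketch of a cross-validated classifier that flags training sessions likely to precede a performance drop. The feature names, data, and label rule are all invented for illustration; no real team data is involved:

```python
# Toy sketch: cross-validated model on synthetic wearable-style data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Hypothetical per-session features:
# [average heart rate (bpm), sprints per session, training load (a.u.)]
X = np.column_stack([
    rng.normal(150, 12, 1000),
    rng.poisson(20, 1000).astype(float),
    rng.normal(300, 60, 1000),
])
# Hypothetical label: 1 = performance dropped in the following match
y = ((X[:, 0] > 160) & (X[:, 2] > 330)).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
# 5-fold cross-validation guards against overfitting to a single split
scores = cross_val_score(model, X, y, cv=5)
print(f"Mean CV accuracy: {scores.mean():.2f}")
```

The cross-validated score, rather than a single train/test result, is what an analyst would trust before acting on such a model.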
With the help of video, wearable, and sensor data, it is possible to identify exactly when an athlete's performance drops, so corrective steps can be taken. It is now possible to assess a player's physiological and technical attributes and work on specific training drills to push them to an optimal level. Developing search-powered data intelligence platforms seems to be the way forward. A good example is Tellius, a search-based data intelligence tool that lets you assess a player's efficiency in terms of fitness and performance through search-powered analytics.

Smells like team spirit: Better team and athlete management

Analytics also helps coaches manage their teams better. For example, Adidas has developed a system called miCoach, which works by having players wear devices during games and training sessions. The data obtained from the devices highlights the top performers and those who need rest. It is also possible to identify and improve patterns in a team's playing style, and to develop a 'system' that improves gameplay efficiency. For individual athletes, real-time stats such as speed, heart rate, and acceleration can help trainers plan training and conditioning sessions accordingly. Intelligent answers about player and team performance and real-time in-game tactics will make the lives of coaches and management a lot easier going forward.

All in the game: Improving game-day strategy

By analyzing real-time training data, it is possible to identify the fitter, in-form players to pick for a game. Analyzing the opposition and choosing the right strategy to beat them also becomes easier once you have the relevant data insights. Different data visualization techniques can be used not just with historical data but also with real-time data while the game is in progress.

Splashing the cash: Boosting merchandising

What are fans buying once they're inside the stadium? Is it the home team's shirt, or their scarves and posters? What food are they eating in the stadium eateries? By analyzing all this data, retailers and club merchandise stores can stock fan-favorite merchandise and other items in adequate quantities, so they never run out. Analyzing sales via online portals and e-stores also helps teams identify the countries or areas where buyers live - a good indicator of where to concentrate sales and marketing efforts. Analytics also plays a key role in product endorsements and sponsorships. Which brands to endorse, the best possible sponsor, the ideal duration of a sponsorship, and the sponsorship fee are all decisions that can be made by analyzing current trends alongside historical data.

Challenges in sports analytics

Although the advantages offered by analytics are there for all to see, many sports teams have still not incorporated analytics into their day-to-day operations. Lack of awareness seems to be the biggest factor: many teams underestimate, or still don't understand, the power of analytics. Choosing the right Big Data and analytics tools is another challenge. With humongous amounts of data in particular, the time investment needed to clean and format the data for effective analysis is a commitment many teams aren't willing to make.
Another challenge is the rising demand for analysts against a sharp deficit in supply, which drives salaries higher. Add the need for a thorough understanding of the sport to draw effective insights from the data, and finding the right data experts becomes even more difficult.

What next for sports analytics?

Understanding data and how it can be used in sports - to improve performance and maximize profits - is now deemed by many teams to be the key differentiator between success and failure. And it's not just success that teams are after - it's sustained success, and analytics goes a long way in helping teams achieve that. Gone are the days when traditional ways of finding insights were enough. Sports have evolved, and teams are digging deeper into data to get that slightest edge over the competition, which can prove massive in the long run.

If you found this article insightful, make sure you check out our interview on sports analytics with ESPN Senior Stats Analyst Gaurav Sundararaman.

Five Most Surprising Applications of IoT

Raka Mahesa
16 Aug 2017
5 min read
The Internet of Things has been growing for quite a while now. The promise of smart, connected gadgets has resulted in many, many applications of the Internet of Things. Some of these projects are useful, and some are not. Some of these applications, like smart TVs, smartwatches, and smart homes, are expected, whereas others are not. Let's look at a few surprising applications that tap into the Internet of Things, starting with a project from Google.

1. Google's Project Jacquard

Simply put, Project Jacquard is a smart jacket - a literal piece of clothing you can wear that is connected to your smartphone. By tapping and swiping on the jacket sleeve, you can control the music player and map application on your smartphone. The project is a collaboration between Google and Levi's: Google invented a fabric that can read touch input, and Levi's applied the technology to a product people will actually want to wear.

Even now, the idea of a fabric we can interact with boggles my mind. My biggest problem with wearables like smartwatches and smart bands is that they feel like yet another device we need to take care of. A jacket, meanwhile, is something we just wear, with its smart capability as an added benefit. Not to mention that connected fabric allows more aspects of our daily life to be integrated with our digital life. That said, Project Jacquard is not the first smart clothing - other projects, like Athos, embed sensors in their garments - but it is the first that lets people actually interact with their clothing.

2. Hapifork

Hapifork was actually one of the first smart gadgets I became aware of. As the name suggests, Hapifork is a smart fork with a capacitive sensor, a motion sensor, a vibration motor, and a micro USB port. You might wonder why a fork needs all those bells and whistles. Well, Hapifork uses those sensors to detect your eating motion and alerts you if you are eating too fast. After all, eating too fast can cause weight gain and other health issues, so the fork tries to help you live a healthier life. While the idea has some merit, I'm still not sure an unwieldy smart fork is a good way to make us eat healthier; actually eating healthy food is a better way to do that. Still, the idea of smart eating utensils is fascinating. I would totally get a smart plate capable of counting the calories in my food.

3. Smart food makers

In 2016 a wave of smart food-making devices started and successfully completed their crowdfunding campaigns. These devices are designed to make preparing food easier and quicker - much easier than using a microwave oven, that is. The problem is that these devices are pricey, and each can only prepare one specific type of food. There is CHiP, which can bake various kinds of cookies from a set of dough, and there is Flatev, which can bake tortillas from a pod of dough. While the concept may sound odd at first, having a dedicated device for a specific type of food is actually not that strange. After all, we already have a machine that only makes a cup of fresh coffee, so a machine that only makes a fresh plate of cookies could be the next natural step.

4. Smart tattoos

Of all the things that could be smart and connected, a tattoo is definitely not the first that comes to mind. But apparently that's not the case for plenty of researchers around the world.
There have been a couple of bleeding-edge projects that resulted in connected tattoos. L'Oreal has created tattoos that can detect ultraviolet exposure, Microsoft and MIT have created tattoos that users can use to interact with smartphones, and late last year a group of researchers created a tattoo with an accelerometer that can detect a user's heartbeat. So far, wearables have been smart accessories that you wear daily. Since you also wear your skin every day, would it count as a wearable too?

5. Oombrella

If you ever doubted that humans are creative creatures, just remember that it was a human who invented the concept of a smart umbrella. Oombrella is a connected umbrella that notifies you when it's about to rain and when you've left it behind in a restaurant. These functionalities may sound passable at first, until you realize that the weather notification comes from your smartphone - you really just need a weather app instead of a smart umbrella. That said, the project was successfully crowdfunded, so maybe people actually do want a smart umbrella.

About the author

Raka Mahesa is a game developer at Chocoarts (http://chocoarts.com/) who is interested in digital technology in general. Outside of work hours, he likes to work on his own projects, with Corridoom VR being his latest released game. Raka also regularly tweets as @legacy99.

IBM Think 2018: 6 key takeaways for developers

Amey Varangaonkar
17 Apr 2018
5 min read
This year, IBM Think 2018 was hosted in Las Vegas from March 20 to 22. It was one of the most anticipated IBM events of 2018, with over 40,000 developers, technology leaders, and business leaders in attendance. Considered IBM's flagship conference, Think 2018 combined previous conferences such as IBM InterConnect and World of Watson.

IBM Think 2018: Key takeaways

- IBM Watson Studio announced - a platform where data professionals in different roles can come together to build end-to-end Artificial Intelligence workflows
- Integration of IBM Watson with Apple's Core ML, for incorporating custom machine learning models into iOS apps
- IBM Blockchain Platform announced, for Blockchain developers to build enterprise-grade decentralized applications
- Deep Learning as a Service announced as part of Watson Studio, allowing you to train deep learning models more efficiently
- Fabric for Deep Learning open sourced, so you can train models with open source deep learning frameworks and then integrate them with Watson Studio
- Neural Network Modeler announced for Watson Studio, a GUI tool to design neural networks efficiently without a lot of manual coding
- IBM Watson Assistant announced, an AI-powered digital assistant for automotive vehicles and hospitality

Here are the announcements and takeaways that have excited us, along with developers all around the world.

IBM Watson Studio announced

One of the biggest announcements of the event was IBM Watson Studio - a premier tool that brings together data scientists, developers, and data engineers to collaborate on, build, and deploy end-to-end data workflows. From accessing your data source to deploying accurate, high-performance models, this platform does it all. It is just what enterprises need today to leverage Artificial Intelligence, accelerate research, and draw intuitive insights from their data. IBM Watson Studio's Lead Product Manager, Armand Ruiz, gives a sneak peek into what we can expect from Watson Studio.

Collaboration with Apple Core ML

IBM took its relationship with Apple to another level by announcing a collaboration to develop smarter iOS applications. IBM Watson's Visual Recognition service can be used to train custom Core ML machine learning models, which can be used directly by iOS apps. The announcement comes as no surprise, considering IBM had already released new developer tools for enterprise development in the Swift language.

IBM Watson Assistant announced

IBM Think 2018 also announced the evolution of Watson Conversation into Watson Assistant, introducing new features and capabilities to deliver a more engaging, personalized customer experience. With this, IBM plans to take the concept of AI assistants for businesses to a new level. Currently in beta, two domain-specific solutions are available on top of Watson Assistant: Watson Assistant for Automotive and Watson Assistant for Hospitality.

IBM Blockchain Platform

Per Juniper Research, more than half of the world's big corporations are considering adopting, or are already in the process of adopting, Blockchain technology. This presents a serious opportunity for a developer-centric platform for building custom decentralized networks. IBM, unsurprisingly, has identified this opportunity and come up with a Blockchain development platform of its own - the IBM Blockchain Platform.
Recently launched as a beta, the platform offers a pay-as-you-use option for Blockchain developers to build their own enterprise-grade Blockchain solutions without hassle.

Deep Learning as a Service

Training a deep learning model is tricky: it requires you to design the right kind of neural network and choose the right hyperparameters. This is a significant pain point for data scientists and machine learning engineers. To tackle the problem, IBM announced Deep Learning as a Service as part of Watson Studio. It includes the Neural Network Modeler (explained in detail below) to simplify the process of designing and training neural networks. Alternatively, the service lets you train your networks manually with popular deep learning libraries and frameworks such as PyTorch, TensorFlow, Caffe, and Keras.

In the process, IBM also open sourced the core functionality of Deep Learning as a Service as a separate project - Fabric for Deep Learning. This allows models to be trained with different open source frameworks on Kubernetes containers, making use of GPU processing power, and then integrated with Watson Studio.

Accelerating deep learning with the Neural Network Modeler

In a bid to reduce the complexity and manual work that go into designing and training neural networks, IBM introduced a beta release of the Neural Network Modeler within Watson Studio. The new feature lets you design standardized neural network models without going into a lot of technical detail, thanks to its intuitive GUI. With this announcement, IBM aims to accelerate the overall deep learning process, so that data scientists and machine learning developers can focus more on thinking than on operations.

At Think 2018, the IBM Research team also presented its annual '5 in 5' predictions, highlighting five key innovations currently in research that are expected to change our lives in the near future.

With these announcements, it's clear that IBM is well in sync with the two hottest trends in the tech space today - Artificial Intelligence and Blockchain. The company seems to be taking every possible step to be the preferred choice of tool for data scientists and machine learning developers. We expect the services above to improve and see more mainstream adoption over time, as most of them are still in beta, with scope for further improvements and new functionality as the platforms develop. What did you think of these announcements by IBM? Do let us know!

Promising DevOps Projects

Julian Ursell
29 Jan 2015
3 min read
The DevOps movement is currently driving a wave of innovation in technology, contributing to the development of powerful systems and software development architectures, as well as a transformation in "systems thinking". Game changers like Docker have revolutionized the way system engineers, administrators, and application developers approach their jobs, and there is now a concerted push to iterate on the new paradigms that have emerged. The crystallization of container-based virtualization has produced a different perspective on service infrastructures, enabling a modularity and granularity hardly imaginable a decade ago. Powerful configuration management tools such as Chef, Puppet, and Ansible allow infrastructure to be defined literally as code. The flame of innovation is burning brightly in this space, and the "DevOps engineer" is becoming a reality rather than the idealistic myth it once appeared to be. Now that DevOps engineers know roughly where they're going, a feverish development drive is gathering pace, with projects looking to build on the flagship technologies that provided the initial spark. The next few years will be fascinating for seeing how the DevOps foundations laid down so far are built upon.

The major foundation of modern DevOps development is the refinement of containerization, in both concept and implementation. Docker has demonstrated how containers can be leveraged to host, run, and deploy applications, servers, and services in an incredibly lightweight fashion, abstracting resources by isolating parts of the operating system in separate containers. The sea change in thinking this has created has been resounding. Still, a particular challenge for DevOps engineers working with containers at scale is effective orchestration. Enter Kubernetes (apparently meaning "helmsman" in Greek), the project open sourced by Google for the orchestration and management of container clusters. The value of Kubernetes is that it works alongside Docker, going beyond simply booting containers to allow a finer degree of management and monitoring. It utilizes units called "pods" that facilitate communication and data sharing between Docker containers and the grouping of application-specific containers. The Docker project has taken the orchestration service Fig under its wing for further development, but there are myriad ways in which containers can be orchestrated. Kubernetes illustrates how the wave of DevOps-oriented technologies like Docker is driving large-scale companies to open source their own solutions and contribute to the spirit of open source collaboration that underlines the movement.

The influence of DevOps can also be seen in the reappraisal of operating system architectures. CoreOS, for example, is a Linux distribution designed with scale, flexibility, and lightweight resource consumption in mind. It hosts applications as Docker containers and makes the development of large-scale distributed systems easier by being "natively" clustered, meaning it is adapted naturally for use across multiple machines. Under the hood it offers powerful tools, including Fleet (CoreOS's cluster orchestration system) and etcd for service discovery and information sharing between cluster nodes.
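To give a flavor of how such orchestration layers are driven in practice, here is a minimal sketch using the official Kubernetes Python client. This assumes a running cluster, a local kubeconfig, and the kubernetes package installed; it simply lists the pods the cluster is running:

```python
# Minimal sketch: list pods across all namespaces.
# Assumes `pip install kubernetes` and a reachable cluster whose
# credentials live in the default kubeconfig (e.g. ~/.kube/config).
from kubernetes import client, config

config.load_kube_config()   # load cluster credentials from kubeconfig
v1 = client.CoreV1Api()     # core API: pods, services, nodes, and so on

pods = v1.list_pod_for_all_namespaces(watch=False)
for pod in pods.items:
    print(pod.status.pod_ip, pod.metadata.namespace, pod.metadata.name)
```

The same API surface exposes create, update, and delete operations, which is what makes cluster state scriptable rather than hand-managed.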
A tool to watch in the future is Terraform (built by the same team behind Vagrant), which offers at its core the ability to build infrastructure from combined resources across multiple service providers, such as DigitalOcean, AWS, and Heroku, describing that infrastructure as code with an abstracted configuration syntax. It will be fascinating to see whether Terraform catches on and opens up to a greater mass of major service providers. Kubernetes, CoreOS, and Terraform all convey the immense development pull generated by the DevOps movement, one that looks set to roll on for some time yet.

5 things that will matter in web development in 2018

Richard Gall
11 Dec 2017
4 min read
2017 has been an interesting year in web development. Today the role of a web developer stretches across the stack - to be a great developer you need to be confident and dexterous with data, and have an eye for design and UX. All those discrete skills will remain important, but being able to join the pieces of the development workflow together - for maximum efficiency - will be hugely valuable in 2018. What web development tools will matter most in 2018? Find out here. But what will really matter in web development in 2018? Here's our list of the top 5 things you need to be thinking about.

1. Getting over JavaScript fatigue

JavaScript fatigue has been the spectre haunting web development for the last couple of years. But its effects have been very real - it's exhausting keeping up with the rapidly expanding ecosystem of tools. 'Getting over it', then, won't be easy - and don't think for a minute we're just saying it's time to move on and get real. Instead it's about taking the problem seriously and putting strategies in place to better manage tooling options. This article is a great exploration of JavaScript fatigue, and it puts the problem rather neatly: JS fatigue happens when people use tools they don't need to solve problems they don't have. What this means in practical terms is that starting with the problem you want to solve will make life much better in 2018.

2. Web components

Web components are a development that's making the work of web developers that little bit easier. Essentially they're reusable 'bits' that don't require support from a library (like jQuery, for example), which makes front-end development much more streamlined. Developments like this hint at a shift in the front-end developer skillset - something we'll be watching closely throughout 2018. If components are making development 'easier', there will be an onus on developers to prove themselves in the design and UX sphere.

3. Harnessing artificial intelligence

AI has taken on wider significance in 2017, coming to the forefront not only of the tech world's imagination but the wider public's too. It's no longer an academic pursuit; it's now baked into almost everything we do. That means web developers are going to have to get au fait with artificial intelligence. Building more personalized UX will be top of the list for many organizations in 2018 - the pressure will be on web developers to harness artificial intelligence in innovative ways that drive value for their businesses and clients.

4. Progressive web apps and native-like experiences

This builds on the previous two points, but ultimately it's about what user expectations will look like in 2018. The demand will be for something that is not only personalized (see #3), but also secure, fast, and intuitive for the user, whatever their specific context. Building a successful progressive web app requires an acute sense of how every moving part affects how a user interacts with it - from the way data is utilized to how the UI is built. 2018 is the year when being able to solve and understand problems in a truly holistic way will be vital.

5. Improving the development experience

Web development is going to get simultaneously harder and easier - if that makes sense. Web components may speed things up, but your time will no doubt quickly be filled by something else. This means that in 2018 we need to pay close attention to the development experience.
If, for example, we're being asked to do new things and deliver products in new ways, we need the tools to do that. If agility and efficiency remain key (which of course they will), unlocking smarter ways of working will be as important as the very things we build. Tools like Docker will undoubtedly help here. In fact, it's worth looking closely at the changing toolchain of DevOps - it has had an impact throughout 2017 and certainly will in 2018 too.

Speed, Speed, Speed

Owen Roberts
26 Jul 2016
3 min read
We're currently in the middle of our Skill Up campaign, with all our titles at just $10 each! If you're late to the party, Skill Up is your opportunity to get the knowledge and skills you need to become the developer you want to be in 2016 and beyond. Along with the launch of the campaign we've also released our 2016 Skill Up report; you can download it here if you haven't done so already. In the report, one particular piece of information really stood out to me: the link between salary and programming language used. Take a look at the graph below:

The one thing linking the top-earning languages is speed. Whether it's SQL's ease of fine-tuning or C's structured design created for simplicity, each is renowned for being faster than its peers and the alternatives. It should be no surprise that faster languages end up commanding more pay. Scala's ability to handle the stress of big data applications and still crunch data fast has made it one of, if not THE, biggest programming languages for anything big data related. Even Perl, a language that has fallen by the wayside in the eyes of many since 2005, is a speed machine, often beating Python 3 at everyday tasks and carving out its own niche in finance, bioinformatics, and other specialized sectors.

The benefits for a company of hiring developers who can build fast software are obvious - we all know how important a fast product is to customers; the few seconds it takes to load can be the deciding factor in whether your product is picked up or left in the dust. This is especially true for enterprise-level applications and big data crunchers. If the solution you're selling is too slow for the cost you're offering, why should customers stay with your product when there are so many potentially faster options on the market at a comparable price?

So this Skill Up, why not take the opportunity to speed up your applications? Whether it's by laying the foundations of a completely new language (it's never too late to change, after all) or checking out how to streamline your current apps for better performance and speed, there's no better time to make sure your programming skills are better than ever.