
Tech News


PrimeNG 8.0.0 releases with Angular 8 support, FocusTrap, and more

Bhagyashree R
14 Jun 2019
2 min read
Yesterday, the team behind PrimeNG, a collection of rich UI components for Angular, announced the release of PrimeNG 8.0.0. This release comes with Angular 8.0 support, a new feature called FocusTrap, and other quality improvements. Here are some of the updates in PrimeNG 8.0.0:

Compatibility with Angular 8

The main focus of this release was supporting Angular 8. PrimeNG 8.0.0 does not yet ship with Ivy support, as there are various breaking changes for the team to tackle in 8.x. "It is easier to use Ivy although initially there are no significant gains, for library authors such as ourselves there are challenges ahead to fully support Ivy," the team wrote in the announcement. The Ivy compiler is opt-in right now, but in a future release, probably v9, it is expected to become the default. Currently there are "no real gains" from using it; however, you can give it a whirl to check whether your app works correctly with Ivy. You can enable it by adding "enableIvy": true to your angularCompilerOptions and restarting your application.

Another issue to keep in mind is Angular 8's web animations regression, which breaks your application if you add import 'web-animations-js'; to polyfills.ts. PrimeNG 8.0.0 users are advised to use a fork of web-animations until the issue is fixed.

Other new features and enhancements

- FocusTrap, a new directive that keeps focus within a certain DOM element while tabbing.
- Spinner now has decimalSeperator and thousandSeperator props, and a new formatInput prop that formats input numbers according to localSeperators.
- The FileUpload component now uses HttpClient, which works with interceptors; as a result, the team has removed onBeforeSend and added onSend. A headers prop is introduced on FileUpload to define HttpHeaders for the POST request.
- The 'rows' property of Table now supports two-way binding.

Read more about PrimeNG 8.0 on its official website.
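The Ivy opt-in described above goes in the Angular compiler options of your TypeScript config; a minimal tsconfig.json sketch (surrounding options illustrative, only enableIvy is from the announcement):

```json
{
  "compilerOptions": {
    "target": "es2015",
    "module": "esnext"
  },
  "angularCompilerOptions": {
    "enableIvy": true
  }
}
```

After saving the change, restart the dev server so the Ivy compiler picks it up.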
Angular 8.0 releases with major updates to framework, Angular Material, and the CLI
5 useful Visual Studio Code extensions for Angular developers
Ionic Framework 4.0 has just been released, now backed by Web Components, not Angular


Chrome 76 Beta released with dark mode, Flash blocking by default, new PWA features and more

Sugandha Lahoti
14 Jun 2019
3 min read
Yesterday, Google released Chrome 76 Beta with a number of features, including blocking Flash by default, a dark mode, and making it harder for sites to detect when you're using Incognito Mode to get around paywalls.

https://twitter.com/GoogleChromeDev/status/1139246837024509952

Flash blocked by default

Chrome 76 Beta blocks Flash in the browser by default. Users still have the option to switch back to the current "Ask first" option in chrome://settings/content/flash. With that option, explicit permission is required for each site after every browser restart.

Changes to the Payments API

Chrome 76 fixes the FileSystem API to address how websites were able to detect whether you're using Incognito Mode to get around a paywall; "detect private mode" scripts can no longer take advantage of that indicator. Chrome 76 Beta also makes it easier to use the payments APIs with self-signed certificates in a local development environment.

https://twitter.com/paul_irish/status/1138471166115368960

Additionally, PaymentRequestEvent has a new method called changePaymentMethod(), and the PaymentRequest object now supports an event handler called paymentmethodchange. You can use both to notify a merchant when the user changes payment instruments. The former returns a promise that resolves with a new PaymentRequest instance.

Improvements for Progressive Web Apps

Chrome 76 Beta makes it easier for users to install Progressive Web Apps on the desktop by adding an install button to the omnibox. On mobile, developers can now replace Chrome's Add to Home Screen mini-infobar with their own prompt. PWAs will also check for updates more frequently starting with Chrome 76: every day, instead of every three days.

New dark mode

Chrome 76 Beta also adds dark mode support. Websites can now automatically enable a dark theme and respect user preference by adding a little extra code using the prefers-color-scheme media query.
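The prefers-color-scheme media query mentioned above can be as simple as this (selectors and colors are illustrative):

```css
/* Default (light) theme */
body {
  background: #ffffff;
  color: #222222;
}

/* Applied when the user's OS or browser requests a dark theme */
@media (prefers-color-scheme: dark) {
  body {
    background: #121212;
    color: #e0e0e0;
  }
}
```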
Other improvements

Browsers prevent calls to abusable APIs (like popup, fullscreen, vibrate, etc.) unless the user activates the page through direct interaction. However, not all interactions trigger user activation. Going forward, the Escape key is no longer treated as a user activation. Chrome 76 Beta also introduces a new HTTP request header that sends additional metadata about a request's provenance to the server, allowing it to make security decisions.

The lazyload feature policy has been removed. This policy was intended to let developers selectively control the lazyload attribute on iframe and img tags, providing more control over loading delay for embedded content and images on a per-origin basis.

The stable release of Chrome 76 is tentatively scheduled for July 30th. You can read about additional changes on Google's Chromium blog post.

Is it time to ditch Chrome? Ad blocking extensions will now only be for enterprise users
Google Chrome will soon support LazyLoad, a solution to lazily load below-the-fold images and iframes
Mozilla puts "people's privacy first" in its browser with updates to Enhanced Tracking Protection, Firefox Lockwise and Monitor
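The provenance request header referred to under "Other improvements" is part of Chrome's Fetch Metadata work (headers such as Sec-Fetch-Site). A server-side check might look roughly like this; a sketch, assuming the Fetch Metadata header names, not taken from the Chrome announcement:

```python
def is_allowed(headers: dict) -> bool:
    """Decide whether to serve a request based on Fetch Metadata.

    Browsers that send Fetch Metadata (Chrome 76+) attach a
    Sec-Fetch-Site header describing where the request came from.
    Older browsers omit it, so absence must be treated as allowed.
    """
    site = headers.get("Sec-Fetch-Site")
    if site is None:
        # Header not sent: legacy browser, cannot apply the policy.
        return True
    # same-origin / same-site / none (user-typed URL) are fine;
    # cross-site requests to sensitive endpoints can be refused.
    return site in ("same-origin", "same-site", "none")
```

The key property is that the browser, not the page, sets the header, so a malicious cross-site page cannot forge it.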


Google researcher reveals an unpatched bug in Windows' cryptographic library that can quickly "take down a Windows fleet"

Savia Lobo
13 Jun 2019
3 min read
Tavis Ormandy, a vulnerability researcher at Google, uncovered a security issue in SymCrypt, the core cryptographic library of Windows, which the Microsoft team is still working to fix. Ormandy says that if the vulnerability is exploited in a denial of service (DoS) attack, it could "take down an entire Windows fleet relatively easily". Ormandy said that Microsoft had "committed to fixing it in 90 days", in line with Google's 90-day deadline for fixing or publicly disclosing bugs that its researchers find.

https://twitter.com/taviso/status/1138469651799728128

On March 13, 2019, Ormandy informed Microsoft of the vulnerability and also posted the issue on Google's Project Zero site. On March 26, Microsoft replied saying that it would issue a security bulletin and fix in the June 11 Patch Tuesday run. On June 11, Ormandy said that the Microsoft Security Response Center (MSRC) had "reached out and noted that the patch won't ship today and wouldn't be ready until the July release due to issues found in testing".

"There's a bug in the SymCrypt multi-precision arithmetic routines that can cause an infinite loop when calculating the modular inverse on specific bit patterns with bcryptprimitives!SymCryptFdefModInvGeneric", the bug report mentions. "I've been able to construct an X.509 certificate that triggers the bug. I've found that embedding the certificate in an S/MIME message, authenticode signature, schannel connection, and so on will effectively DoS any windows server (e.g. ipsec, iis, exchange, etc) and (depending on the context) may require the machine to be rebooted. Obviously, lots of software that processes untrusted content (like antivirus) call these routines on untrusted data, and this will cause them to deadlock", Ormandy further added.

"The disclosure a day after the deadline lapsed drew mixed reactions on social media, with some criticizing Ormandy for the move; and were met with short shrift", CBR Online states.
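For context, the modular inverse that SymCryptFdefModInvGeneric computes is conventionally found with the extended Euclidean algorithm. A textbook Python sketch (not SymCrypt's implementation) shows the loop whose termination must hold for every input; the reported bug was an input pattern for which SymCrypt's optimized routine failed to make progress and spun forever:

```python
def mod_inverse(a: int, m: int) -> int:
    """Return x such that (a * x) % m == 1, via extended Euclid.

    The while-loop terminates because the remainder strictly
    decreases on each iteration; an implementation that ever
    fails to shrink the remainder loops forever, which is the
    class of failure described in the bug report.
    """
    old_r, r = a, m
    old_s, s = 1, 0
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r   # gcd step: remainder shrinks
        old_s, s = s, old_s - q * s   # track the Bezout coefficient
    if old_r != 1:
        raise ValueError("a is not invertible modulo m")
    return old_s % m
```

Because X.509 parsing, S/MIME, and TLS handshakes all reach such routines with attacker-supplied numbers, a non-terminating input becomes a remotely triggerable denial of service.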
https://twitter.com/taviso/status/1138493191793963008

Davey Winder from Forbes approached The Beer Farmers, a group of information security professionals, on this issue. John Opdenakker, an ethical hacker from the group, said, "in general if you privately disclose a vulnerability to a company and the company agrees to fix it within a reasonable period of time I think it's fair to publicly disclose it if they then don't fix it on time." Another Beer Farmer professional, Sean Wright, points out that this is a denial of service vulnerability and there are many other ways to achieve this, which makes it a low-severity issue. Wright said to Forbes, "Personally I think it's a bit harsh, every fix is different and they should allow for some flexibility in their deadline."

A Microsoft spokesperson said in a statement to Forbes, "Microsoft has a customer commitment to investigate reported security issues and provide updates as soon as possible. We worked to meet the researcher's deadline for disclosure; however, a customer-impacting regression was discovered that prevented the update from being released on schedule. We advised the researcher of the delay as soon as we were able. Developing a security update is a delicate balance between timeliness and quality, and our ultimate goal is to help ensure maximum customer protection with minimal customer disruption."

To know more about this news in detail, head over to Google's Project Zero website.

All Docker versions are now vulnerable to a symlink race attack
Microsoft quietly deleted 10 million faces from MS Celeb, the world's largest facial recognition database
Microsoft releases security updates: a "wormable" threat similar to WannaCry ransomware discovered


VMware reaches the goal of using 100% renewable energy in its operations, a year ahead of their 2020 vision

Vincy Davis
13 Jun 2019
3 min read
Yesterday, VMware announced that it has achieved its goal of using 100% renewable energy in its operations, a year ahead of its 2020 vision. VMware has always been optimistic about the power of technology to help solve societal problems, and one of its key focus areas has been to change its relationship with energy.

https://twitter.com/PGelsinger/status/1138868618257719297

In 2016, VMware announced its goal to achieve carbon-neutral emissions and to use 100 percent renewable energy by 2020. It has now reached both goals ahead of schedule. In November 2018, VMware achieved carbon neutrality across all its business operations; now it has also powered 100 percent of its operations with renewable energy and joined RE100 a year early. RE100 is a global corporate leadership initiative that commits influential businesses to 100% renewable electricity and accelerates the change towards zero-carbon energy. It is led by The Climate Group in partnership with CDP and works to increase corporate demand for, and delivery of, renewable energy.

As data centers are responsible for two percent of the world's greenhouse gas emissions, VMware's technologies have helped IT infrastructure become more efficient by fundamentally changing how its customers use power. The company says it has helped customers avoid putting 540 million metric tons of carbon dioxide into the atmosphere, equivalent to powering the populations of Spain, Germany and Switzerland for one year. In a blog post, Nicola Acutt, Vice President of Sustainability at VMware, mentioned that the company achieved RE100 through a combination of strategies:

- Opting into clean power through local utilities
- Locating assets in areas with renewable energy
- Purchasing renewable energy credits (RECs) where the above were not feasible
This signals demand to the global renewable energy market and enables the development of its infrastructure. According to the U.N.'s report, around 70-85 percent of electricity will have to be shifted to renewable sources by 2050 to avoid the worst impacts of climate change. Acutt states that to achieve this goal, all establishments will have to take a systems approach to becoming more efficient, which will help drive the transition to a sustainable economy globally.

The response to this news has been positive, with people praising VMware for reaching RE100 ahead of schedule.

https://twitter.com/T180985/status/1139059931695345665
https://twitter.com/songsteven2/status/1138908028714065923
https://twitter.com/RSadorus/status/1138985222815404032

Deep learning models have massive carbon footprints, can photonic chips help reduce power consumption?
Now there's a CycleGAN to visualize the effects of climate change. But is this enough to mobilize action?
Responsible tech leadership or climate washing? Microsoft hikes its carbon tax and announces new initiatives to tackle climate change


.NET Core 3.0 Preview 6 is available, packed with updates to compiling assemblies, optimizing applications, ASP.NET Core and Blazor

Amrata Joshi
13 Jun 2019
4 min read
Yesterday, the team at Microsoft announced that .NET Core 3.0 Preview 6 is now available. It includes updates for compiling assemblies for improved startup, optimizing applications for size with the linker, and EventPipe improvements. The team has also released new Docker images for Alpine on ARM64. Additionally, they have made updates to ASP.NET Core and Blazor: the preview comes with new Razor and Blazor directive attributes, as well as authentication and authorization support for Blazor apps, and much more.

https://twitter.com/dotnet/status/1138862091987800064

What's new in .NET Core 3.0 Preview 6

Docker images

The .NET Core Docker images and repos, including microsoft/dotnet and microsoft/dotnet-samples, have been updated. Docker images are now available for both .NET Core and ASP.NET Core on ARM64.

EventPipe enhancements

With Preview 6, EventPipe now supports multiple sessions; users can consume events with EventListener in-proc and have out-of-process EventPipe clients.

Assembly linking

The .NET Core 3.0 SDK offers a tool that can help reduce the size of apps by analyzing the IL with the linker and cutting out unused assemblies.

Improving startup time

Users can improve the startup time of their .NET Core application by compiling their application assemblies in the ReadyToRun (R2R) format. R2R, a form of ahead-of-time (AOT) compilation, is supported with .NET Core 3.0 but can't be used with earlier versions of .NET Core.

Additional functionality

The Native Hosting sample the team posted recently demonstrates an approach for hosting .NET Core in a native application. The team is now exposing general functionality to .NET Core native hosts as part of .NET Core 3.0. The functionality is mostly related to assembly loading, which makes it easier to produce native hosts.
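The ReadyToRun option mentioned above is enabled through a project-file property and a runtime-specific publish; a minimal sketch (project name and target runtime are illustrative):

```xml
<!-- MyApp.csproj: opt the published assemblies into ReadyToRun -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp3.0</TargetFramework>
    <PublishReadyToRun>true</PublishReadyToRun>
  </PropertyGroup>
</Project>
```

Publishing then targets a specific runtime identifier, e.g. `dotnet publish -c Release -r win-x64`, since R2R images are platform-specific.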
New Razor features

In this release, the team has added support for the following new Razor features:

- @attribute: a new directive that adds the specified attribute to the generated class.
- @code: a new directive used in .razor files for specifying a code block whose members are added to the generated class.
- @key: in .razor files, the new @key directive attribute specifies a value the Blazor diffing algorithm can use to preserve elements or components in a list.
- @namespace: the @namespace directive now works in pages and views apps and is also supported with components (.razor).

Blazor directive attributes

In this release, the team has standardized a common syntax for directive attributes in Blazor, which makes the Razor syntax used by Blazor more consistent and predictable.

Event handlers

In Blazor, event handlers now use the new directive attribute syntax rather than the plain HTML syntax. The new syntax is similar to the HTML syntax, but the leading @ character makes C# event handlers distinct from JS event handlers.

Authentication and authorization support

With this release, Blazor has built-in support for handling authentication and authorization. The server-side Blazor template also supports the options used for enabling the standard authentication configurations with ASP.NET Core Identity, Azure AD, and Azure AD B2C.

Certificate and Kerberos authentication in ASP.NET Core

Preview 6 also brings certificate and Kerberos authentication to ASP.NET Core. Certificate authentication requires users to configure the server to accept certificates, then add the authentication middleware in Startup.Configure and the certificate authentication service in Startup.ConfigureServices.

Users are happy with this news and think the updates will be useful.
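The @code directive and the @-prefixed event-handler syntax combine like this in a small, hypothetical .razor component (not taken from the announcement):

```razor
@* Counter.razor - illustrative Blazor component *@
<button @onclick="Increment">Clicked @count times</button>

@code {
    private int count = 0;

    private void Increment() => count++;
}
```

Note the @onclick directive attribute wiring a C# method, and the @code block contributing the count field and Increment method to the generated class.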
https://twitter.com/gcaughey/status/1138889676192997380
https://twitter.com/dodyg/status/1138897171636531200
https://twitter.com/acemod13/status/1138907195523907584

To know more about this news, check out the official blog post.

.NET Core releases May 2019 updates
An introduction to TypeScript types for ASP.NET core [Tutorial]
What to expect in ASP.NET Core 3.0


Introducing Voila, which turns your Jupyter notebooks into standalone web applications

Bhagyashree R
13 Jun 2019
3 min read
Last week, a Jupyter Community Workshop on dashboarding was held in Paris. At the workshop, several contributors came together to build the Voila package, the details of which QuantStack shared yesterday. Voila serves live Jupyter notebooks as standalone web applications, providing a neat way to share your results with colleagues.

Why do we need Voila?

Jupyter notebooks allow you to do "literate programming", in which human-friendly explanations accompany code blocks. This lets scientists, researchers, and other practitioners of scientific computing include the theory behind their code, including mathematical equations. However, Jupyter notebooks can prove problematic when you want to communicate your results to non-technical stakeholders, who may be put off by the code blocks and by the need to run the notebook to see the results. The notebook format also has no mechanism to prevent arbitrary code execution by the end user.

How does Voila work?

Voila addresses these concerns by converting your Jupyter notebook into a standalone web application. After connecting to a notebook URL, Voila launches the kernel for that notebook and runs all the cells. Once execution is complete, it does not shut down the kernel. The notebook is converted to HTML and served to the user; this rendered HTML includes JavaScript responsible for initiating a websocket connection with the Jupyter kernel. (A diagram depicting this flow appears in the original Jupyter Blog post.)

Voila provides the following features:

- Renders Jupyter interactive widgets: it supports Jupyter widget libraries including bqplot, ipyleaflet, ipyvolume, ipympl, ipysheet, plotly, and ipywebrtc.
- Prevents arbitrary code execution: it does not allow arbitrary code execution by consumers of dashboards.
- A language-agnostic dashboarding system: Voila is built on Jupyter's standard protocols and file formats, enabling it to work with any Jupyter kernel (C++, Python, Julia).
- A custom template system for better extensibility: it provides a flexible template system to produce rich application layouts.

Many Twitter users applauded this new way of creating live, interactive dashboards from Jupyter notebooks:

https://twitter.com/philsheard/status/1138745404772818944
https://twitter.com/andfanilo/status/1138835776828071936
https://twitter.com/ToluwaniJohnson/status/1138866411261124608

Some users also compared it with another dashboarding solution called Panel. The main difference between Panel and Voila is that Panel supports Bokeh widgets, whereas Voila is framework- and language-agnostic. "Panel can use a Bokeh server but does not require it; it is equally happy communicating over Bokeh Server's or Jupyter's communication channels. Panel doesn't currently support using ipywidgets, nor does Voila currently support Bokeh plots or widgets, but the maintainers of both Panel and Voila have recently worked out mechanisms for using Panel or Bokeh objects in ipywidgets or using ipywidgets in Panels, which should be ready soon," a Hacker News user commented.

To read more about Voila, check out the official announcement on the Jupyter Blog.

JupyterHub 1.0 releases with named servers, support for TLS encryption and more
Introducing Jupytext: Jupyter notebooks as Markdown documents, Julia, Python or R scripts
JupyterLab v0.32.0 releases
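Trying Voila locally is a pip install and a single command; a sketch (the notebook filename is illustrative, and this assumes an environment that already has Jupyter):

```shell
# Install Voila from PyPI
pip install voila

# Serve a notebook as a standalone web app; Voila runs the cells,
# strips the code, and serves the rendered output over HTTP
voila my_dashboard.ipynb
```

Voila then serves the rendered notebook on a local port, with the kernel kept alive so interactive widgets keep working.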

Dropbox gets a major overhaul with updated desktop app, new Slack and Zoom integration

Sugandha Lahoti
13 Jun 2019
3 min read
Dropbox has revamped its traditional cloud storage service and announced a new unified version of its desktop app, which the company is calling "the new Dropbox." The new version is meant to be a single workplace solution that helps you organize content, connect tools, and bring work groups together, unifying productivity tools such as Google Docs, Microsoft Office, Slack, Salesforce, Trello, and Zoom.

Dropbox is becoming a task-management app

The new version of the popular file-sharing service wants to be your file tree, your finder, and your desktop for the cloud. Users can create and store shortcuts to any online project management and productivity tools alongside their content. A unified search bar lets you crawl across your computer's file system and all your cloud storage across other productivity apps. Users can add descriptions to folders to help the team understand more about the work they're doing. Key content can be highlighted by pinning it to the top of a workspace, and users can @mention people and assign to-dos. Users can see file activity and keep tabs on it with a new team activity feed. There's also a "Send feedback" button in the lower-right side of the page to report how the update is working (or not working) for you in practice.

New third-party integrations: Slack and Zoom

Dropbox now integrates with Slack for seamless collaboration between content and communication: users can start Slack conversations and share content to Slack channels directly from Dropbox. Users can also video conference with Zoom by connecting Zoom and their calendar to Dropbox; from Dropbox, they can add and join Zoom meetings where they can share files from their Dropbox.

The new Dropbox has got users quite excited.

https://twitter.com/jsnell/status/1138847481238712320
https://twitter.com/sdw/status/1138518725571665920

Some others have commented that the new Dropbox app is massive in size.
https://twitter.com/puls/status/1138561011684859905
https://twitter.com/sandofsky/status/1138686582859239425

However, some pointed out that the new file-sharing service lacks privacy protections; if it integrates with other productivity tools, there should be a mechanism to keep user data private.

https://twitter.com/TarikTech/status/1139068388964261888

The new version was launched on Tuesday for all of Dropbox's 13 million business users across 400,000 teams, plus its consumer tiers. Users can opt in for early access, and businesses can turn on early access in their admin panel.

Dropbox purchases workflow and eSignature startup 'HelloSign' for $250M
How Dropbox uses automated data center operations to reduce server outage and downtime
Zoom, the video conferencing company files to go public, possibly a profitable IPO


‘Have I Been Pwned’ up for acquisition; Troy Hunt code names this campaign ‘Project Svalbard’

Savia Lobo
12 Jun 2019
4 min read
Yesterday, Troy Hunt revealed on his blog that his 'Have I Been Pwned' (HIBP) website is up for sale. Hunt has codenamed the acquisition Project Svalbard and is working with KPMG to find a buyer.

(Hunt named Project Svalbard after the Svalbard Global Seed Vault, a secure seed bank on the Norwegian island of Spitsbergen. The vault holds the world's largest collection of crop diversity, a long-term seed storage facility for worst-case scenarios such as natural or man-made disasters.)

Commercial subscribers rely heavily on HIBP to alert members of identity theft programs, enable infosec companies to provide services to their customers, protect large online assets from credential stuffing attacks, prevent fraudulent financial transactions, and much more. Governments around the world and law enforcement agencies also use HIBP to protect their departments and aid their investigations.

Hunt says he has been handling everything alone: "to date, every line of code, every configuration and every breached record has been handled by me alone. There is no 'HIBP team', there's one guy keeping the whole thing afloat," he writes. In January this year, he disclosed the Collection #1 data breach, which included 87 GB worth of data in a folder containing 12,000-plus files, nearly 773 million email addresses, and more than 21 million unique passwords from data breaches going back to 2008. Hunt uploaded all of this breached data to HIBP, and since then, he says, the site has seen a massive influx of activity, taking him away from other responsibilities. "The extra attention HIBP started getting in Jan never returned to 2018 levels, it just kept growing and growing," he says. Hunt said he was concerned about burnout, given the increasing scale and incidence of data breaches. Following this, he said it was time for HIBP to "grow up".
He also believed HIBP could do more in the space, including widening its capture of breaches.

https://twitter.com/troyhunt/status/1138322112224083968

"There's a whole heap of organizations out there that don't know they've been breached simply because I haven't had the bandwidth to deal with it all," Hunt said. "There's a heap of things I want to do with HIBP which I simply couldn't do on my own. This is a project with enormous potential beyond what it's already achieved and I want to be the guy driving that forward," Hunt wrote. Hunt also includes a list of "commitments for the future of HIBP" in his blog post. He said he intends to be "part of the acquisition - that is some company gets me along with the project" and that "freely available consumer searches should remain freely available".

Via Project Svalbard, Hunt hopes to enable HIBP to reach more and more people and play "a much bigger role in changing the behavior of how people manage their online accounts." A couple of commenters on the blog post asked Hunt whether he has considered or approached Mozilla as a potential owner. In a reply to one, he writes, "Being a party that's already dependent on HIBP, I reached out to them in advance of this blog post and have spoken with them. I can't go into more detail than that just now, but certainly their use of the service is enormously important to me."

To know more about this announcement in detail, read Troy Hunt's official blog post.

A security researcher reveals his discovery on 800+ Million leaked Emails available online
The Collections #2-5 leak of 2.2 billion email addresses might have your information, German news site, Heise reports
Bo Weaver on Cloud security, skills gap, and software development in 2019
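Much of the ecosystem described above consumes HIBP programmatically. Its Pwned Passwords endpoint, for instance, uses a k-anonymity scheme: the client sends only the first five hex characters of a password's SHA-1 hash and matches the rest locally, so the full hash never leaves the machine. A minimal sketch of the client-side hashing step (endpoint per HIBP's public API documentation):

```python
import hashlib

def pwned_range_query(password: str):
    """Split a password's SHA-1 hex digest into the 5-char prefix
    that is sent to api.pwnedpasswords.com/range/<prefix> and the
    35-char suffix that is matched locally against the response."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = pwned_range_query("password")
url = f"https://api.pwnedpasswords.com/range/{prefix}"
# The response lists suffixes with breach counts; the client checks
# whether `suffix` appears among them, revealing nothing to the server.
```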


Mark Zuckerberg just became the target of the world's first high-profile white hat deepfake op. Can Facebook come out unscathed?

Vincy Davis
12 Jun 2019
6 min read
Yesterday, Motherboard reported that a fake video of Mark Zuckerberg had been posted on Instagram under the username bill_posters_uk. In the video, Zuckerberg appears to give a threatening speech about the power of Facebook.

https://twitter.com/motherboard/status/1138536366969688064

Motherboard mentions that the video was created by artists Bill Posters and Daniel Howe in partnership with advertising company Canny. Previously, Canny, in partnership with Posters, has created several such deepfake videos of Donald Trump, Kim Kardashian, and others. Omer Ben-Ami, one of the founders of Canny, says the video was made to educate the public on the uses of AI and make them realize its potential. According to other news sources, however, the artists created the video to test Facebook's no-takedown policy on fake videos and misinformation, kept up for the sake of retaining their "educational value".

Recently, Facebook received strong criticism for promoting fake videos on its platform. In May, the company refused to remove a doctored video of senior politician Nancy Pelosi. Neil Potts, Public Policy Director at Facebook, had stated that if someone posted a doctored video of Zuckerberg, like the one of Pelosi, it would stay up. Around the same time, Monika Bickert, Vice President for Product Policy and Counterterrorism at Facebook, said of the fake Pelosi video, "Anybody who is seeing this video in News Feed, anyone who is going to share it to somebody else, anybody who has shared it in the past, they are being alerted that this video is false," adding, "And this is part of the way that we deal with misinformation." Following all of this, that stance was put to the test by Zuckerberg's fake video, and it passed: an Instagram spokesperson commented that the video will stay up on the platform but will be removed from recommendation surfaces.
"We will treat this content the same way we treat all misinformation on Instagram," a spokesperson for Instagram told Motherboard. "If third-party fact-checkers mark it as false, we will filter it from Instagram's recommendation surfaces like Explore and hashtag pages."

The fake Mark Zuckerberg video is a short one in which he talks about Facebook's power: "Imagine this for a second: One man, with total control of billions of people's stolen data, all their secrets, their lives, their futures. I owe it all to Spectre. Spectre showed me that whoever controls the data, controls the future." The video is framed with broadcast chyrons reading, "Zuckerberg: We're increasing transparency on ads. Announces new measures to protect elections," created to make it appear like a typical news report.

(As the video is fake and unauthentic, we have not added a link to it in this article.)

The audio in the video sounds much like a voiceover, but without any sync issues, and is loud and clear, while the visuals are nearly accurate: the person shown can blink, move seamlessly, and gesture the way Zuckerberg would. Motherboard reports that the visuals are taken from a real video of Zuckerberg from September 2017, when he was addressing Russian election interference on Facebook. The Instagram post containing the video states that it was created using CannyAI's video dialogue replacement (VDR) technology.
In a statement to Motherboard, Omer Ben-Ami said that for the Mark Zuckerberg deepfake, “Canny engineers arbitrarily clipped a 21-second segment out of the original seven minute video, trained the algorithm on this clip as well as videos of the voice actor speaking, and then reconstructed the frames in Zuckerberg's video to match the facial movements of the voice actor.”

Omer also mentions that “the potential of AI lies in the ability of creating a photorealistic model of a human being. It is the next step in our digital evolution where eventually each one of us could have a digital copy, a Universal Everlasting human. This will change the way we share and tell stories, remember our loved ones and create content.”

A CNN reporter tweeted that CBS is asking Facebook to remove the fake Zuckerberg video because it shows the CBS logo: “CBS has requested that Facebook take down this fake, unauthorized use of the CBSN trademark.”

Apparently, the fake video of Zuckerberg has garnered some good laughs in the community. It is also seen as the next wave in the battle against misinformation on social media sites. A user on Hacker News says, “I love the concept of this. There's no better way to put Facebook's policy to the test than to turn it against them.”

https://twitter.com/jason_koebler/status/1138515287853228032
https://twitter.com/ezass/status/1138592610363174913

But many users are also concerned that if a fake video can look this accurate now, it is going to be a challenge to identify which information is true and which is false. A user on Reddit comments, “This election cycle will be a dry run for the future. Small ads, little bits of phrases and speeches will stream across social media. If it takes hold, I fear for the future. We will find it very, very difficult to know what is real without a large social change, as large as the advent of social media in the first place.”

Another user adds, “I'm routinely surprised by the number of people unaware just how far this technology has progressed just in the past three years, as well as how many people are completely unaware it exists at all. At this point, I think that's scarier than the tech itself.” And another one comments, “True. Also the older generation. I can already see my grandpa seeing a deepfake on Fox news and immediately considering it gospel without looking into it further.”

US regulators plan to probe Google on anti-trust issues; Facebook, Amazon & Apple also under legal scrutiny
Facebook argues it didn’t violate users’ privacy rights and thinks there’s no expectation of privacy because there is no privacy on social media
Google and Facebook allegedly pressured and “arm-wrestled” EU expert group to soften European guidelines for fake news: Open Democracy Report


Amazon announces general availability of Amazon Personalize, an AI-based recommendation service

Vincy Davis
12 Jun 2019
3 min read
Two days ago, Amazon Web Services (AWS) announced in a press release that Amazon Personalize is now generally available to all customers. Until now, this machine learning technology — the same technology used by Amazon.com — was not broadly available for AWS customers to use in their applications.

https://twitter.com/jeffbarr/status/1138430113589022721

Amazon Personalize helps developers easily add custom machine learning models to their applications — personalized product and content recommendations, tailored search results, and targeted marketing promotions — even if they have no machine learning experience. It is a fully managed service that trains, tunes, and deploys custom, private machine learning models. Customers pay only for what they use, with no minimum fees or upfront commitments.

Personalize processes and examines the customer's data, identifies what is meaningful, selects from multiple advanced algorithms built for Amazon's retail business, and trains and optimizes a personalization model customized to that data — all while keeping the customer's data completely private. Customers receive results via an Application Programming Interface (API).

With the general availability of Amazon Personalize, application developers and data scientists at businesses of all sizes, across all industries, can now put the power of Amazon's machine learning expertise to use.

Swami Sivasubramanian, Vice President of Machine Learning, Amazon Web Services, said, “Customers have been asking for Amazon Personalize, and we are eager to see how they implement these services to delight their own end users. And the best part is that these artificial intelligence services, like Amazon Personalize, do not require any machine learning experience to immediately train, tune, and deploy models to meet their business demands.”

Amazon Personalize is now available in US East (Ohio), US East (N. Virginia), US West (Oregon), Asia Pacific (Tokyo), Asia Pacific (Singapore), and EU (Ireland). Amazon charges five cents per GB of data uploaded to Personalize and 24 cents per training hour used to train a custom model. Real-time recommendation requests are priced based on how many are made, with discounts for larger volumes.

Customers who have already added Amazon Personalize to their apps include Yamaha Corporation of America, Subway, Zola, and Segment. In the press release, Ishwar Bharbhari, Director of Information Technology, Yamaha Corporation of America, said, “Amazon Personalize saves us up to 60% of the time needed to set up and tune the infrastructure and algorithms for our machine learning models when compared to building and configuring the environment on our own. It is ideal for both small developer teams who are trying to build the case for ML and large teams who are trying to iterate rapidly at reasonable cost. Even better, we expect Amazon Personalize to be more accurate than other recommender systems, allowing us to delight our customers with highly personalized product suggestions during their shopping experience, which we believe will increase our average order value and the total number of orders.”

Developers are, of course, excited that they can finally use Amazon Personalize in their applications.

https://twitter.com/TheNickWalsh/status/1138243004127334400
https://twitter.com/SubkrishnaRao/status/1138742140996112384
https://twitter.com/PatrickMoorhead/status/1138228634924212229

To get started with Amazon Personalize, head over to this blog post by Julien Simon.

Amazon re:MARS Day 1 kicks off showcasing Amazon’s next-gen AI robots; Spot, the robo-dog and a guest appearance from ‘Iron Man’
US regulators plan to probe Google on anti-trust issues; Facebook, Amazon & Apple also under legal scrutiny
World’s first touch-transmitting telerobotic hand debuts at Amazon re:MARS tech showcase
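The two flat rates quoted above (five cents per GB uploaded, 24 cents per training hour) make rough cost estimates easy. A minimal back-of-the-envelope sketch in Python, deliberately excluding the real-time recommendation requests since those are tiered by volume:

```python
def personalize_monthly_cost(gb_uploaded: float, training_hours: float) -> float:
    """Estimate monthly Amazon Personalize cost from the two flat launch rates:
    $0.05 per GB of data uploaded and $0.24 per training hour.
    Real-time recommendation requests are tiered by volume, so they are
    left out of this rough estimate."""
    PER_GB = 0.05
    PER_TRAINING_HOUR = 0.24
    return gb_uploaded * PER_GB + training_hours * PER_TRAINING_HOUR

# Example: 100 GB of interaction data and 50 hours of model training.
print(f"${personalize_monthly_cost(100, 50):.2f}")  # $17.00
```

The figures (100 GB, 50 hours) are illustrative only; actual training hours depend on dataset size and the algorithm Personalize selects.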

Untangle releases zSeries appliances and NG Firewall v14.2 for enhanced Network Security Framework

Amrata Joshi
12 Jun 2019
2 min read
Yesterday, Untangle, a company that provides network security for SMBs (small and midsize businesses) and distributed enterprises, announced the release of its zSeries appliances. The zSeries appliances provide better performance and functionality at a lower price, giving SMBs as well as distributed enterprises cloud-managed next-generation firewalls. The zSeries includes five appliances, from small desktop models to 1U rackmount servers, as well as a wireless option. All of these appliances come preloaded with NG Firewall 14.2, Untangle's network security software, which makes deployment easy. The zSeries appliances are now available for purchase on the Untangle website.

Heather Paunet, vice president of product management at Untangle, said, “The zSeries offers a simplified lineup to suit customers from branch offices to large campuses. Key upgrades available with the zSeries include faster processors, more RAM, NVMe SSD storage on the z6 and above, and fiber connectivity on the z12 and above.” She further added, “It’s never been easier to deploy cost-effective, cloud-managed network security across dispersed networks while ensuring a consistent security posture for organizations of any size.”

NG Firewall v14.2 packed with enhancements to web security and content filtering

Untangle NG Firewall 14.2 comes with enhancements to web security and content filtering, offers the ability to synchronize users with Azure Active Directory, and brings improvements to intrusion detection and prevention. NG Firewall won the 2019 Security Today Government Security Award (“The Govies”) for network security. NG Firewall v14.2 adds options for flagging, blocking, and alerting based on search terms for YouTube, Yahoo, Google, Bing, and Ask. YouTube searches can now be easily logged, and usage can be locked down to show only content that meets the 'safe search' criteria.

Untangle NG Firewall 14.2 is available as a free upgrade for existing customers. Join Untangle for the community webinar on the zSeries and NG Firewall v14.2 on June 18, 2019 to learn more about the features in 14.2 and the new zSeries appliances. To know more about this news, check out the press release.

Untangle VPN Services
PyPI announces 2FA for securing Python package downloads
All Docker versions are now vulnerable to a symlink race attack


MariaDB announces the release of MariaDB Enterprise Server 10.4

Amrata Joshi
12 Jun 2019
4 min read
Yesterday, the team at MariaDB announced the release of MariaDB Enterprise Server 10.4, codenamed “restful nights”. It is a hardened and secured server, distinct from MariaDB's Community Server. This release focuses on solving enterprise customers' needs, offering them greater reliability, stability, and long-term support in production environments. MariaDB Enterprise Server 10.4 and its backported versions will be available to customers by the end of the month as part of the MariaDB Platform subscription.

https://twitter.com/mariadb/status/1138737719553798144

The official blog post reads, “For the past couple of years, we have been collaborating very closely with some of our large enterprise customers. From that collaboration, it has become clear that their needs differ vastly from that of the average community user. Not only do they have different requirements on quality and robustness, they also have different requirements for features to support production environments. That’s why we decided to invest heavily into creating a MariaDB Enterprise Server, to address the needs of our customers with mission critical production workloads.”

MariaDB Enterprise Server 10.4 comes with added functionality for enterprises running MariaDB at scale in production environments. It undergoes new levels of testing and ships in a secure-by-default configuration. It also includes the same features as MariaDB Server 10.4, including bitemporal tables, an expanded set of instant schema changes, and a number of improvements to authentication and authorization (e.g., password expiration and automatic/manual account locking).

Max Mether, VP of Server Product Management, MariaDB Corporation, wrote in an email to us, “The new version of MariaDB Server is a hardened database that transforms open source into enterprise open source.” He further added, “We worked closely with our customers to add the features and quality they need to run in the most demanding production environments out-of-the-box. With MariaDB Enterprise Server, we’re focused on top-notch quality, comprehensive security, fast bug fixes and features that let our customers run at internet-scale performance without downtime.”

James Curtis, Senior Analyst, Data Platforms and Analytics, 451 Research, said, “MariaDB has maintained a solid place in the database landscape during the past few years.” He added, “The company is taking steps to build on this foundation and expand its market presence with the introduction of MariaDB Enterprise Server, an open source, enterprise-grade offering targeted at enterprise clients anxious to stand up production-grade MariaDB environments.”

Reliability and stability

MariaDB Enterprise Server 10.4 offers the reliability and stability required for production environments. Bug fixes further help maintain that reliability, key enterprise features are backported for customers running earlier versions of MariaDB Server, and long-term support is provided.

Security

Unsecured databases are one of the most common causes of data breaches, so MariaDB Enterprise Server 10.4 is configured out of the box with security settings that support enterprise applications. All non-GA plugins are disabled by default to reduce the risks incurred when using unsupported features. Further, the default configuration is changed to enforce strong security, durability, and consistency.

Enterprise backup

MariaDB Enterprise Server 10.4 offers enterprise backup, which brings operational efficiency to customers with large databases by breaking backups up into non-blocking stages. This way, writes and schema changes can occur during backups rather than having to wait for the backup to complete.

Auditing capabilities

The server adds stronger, easier, and more secure auditing capabilities by logging all changes to the audit configuration. It also logs detailed connection information, giving customers a comprehensive view of changes made to the database.

End-to-end encryption

It also offers end-to-end encryption for multi-master clusters, where the transaction buffers are encrypted to ensure that the data is secure.

https://twitter.com/holgermu/status/1138511727610478594

Learn more about this news on the official web page.

MariaDB CEO says big proprietary cloud vendors “strip-mining open-source technologies and companies”
MariaDB announces MariaDB Enterprise Server and welcomes Amazon’s Mark Porter as an advisor to the board of directors
TiDB open sources its MySQL/MariaDB compatible data migration (DM) tool
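The authentication and authorization improvements mentioned above — password expiration and manual account locking — arrive as plain SQL in MariaDB Server 10.4, which the enterprise build inherits. A brief sketch of the documented `ALTER USER` syntax (the user name and interval are illustrative values, not from the announcement):

```sql
-- Expire a user's password after 120 days, overriding the
-- server-wide default_password_lifetime setting (MariaDB 10.4+).
ALTER USER 'app_user'@'%' PASSWORD EXPIRE INTERVAL 120 DAY;

-- Manually lock an account (logins are refused) and unlock it again.
ALTER USER 'app_user'@'%' ACCOUNT LOCK;
ALTER USER 'app_user'@'%' ACCOUNT UNLOCK;
```

Automatic locking after repeated authentication failures is configured separately; the statements above cover only the manual path.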


YouTube CEO, Susan Wojcicki says reviewing content before upload isn’t a good idea

Fatema Patrawala
12 Jun 2019
8 min read
This year’s annual Recode Code Conference commenced on Monday, June 10 in Arizona. YouTube CEO Susan Wojcicki was interviewed yesterday by Peter Kafka, Recode’s senior correspondent, and there were some interesting exchanges between the two.

https://twitter.com/pkafka/status/1137815486387802112

The event is held over two days and features interviews with the biggest names in the tech business, including sessions from:

YouTube CEO Susan Wojcicki
Facebook executives Adam Mosseri and Andrew Bosworth
Amazon Web Services CEO Andy Jassy
Fair Fight founder Stacey Abrams
Netflix vice president of original content Cindy Holland
Russian Doll star Natasha Lyonne
Medium CEO Ev Williams
Harley-Davidson president and CEO Matthew Levatich
The Uninhabitable Earth author David Wallace-Wells

As Code brings together the most powerful people in tech, this year it also adopted a one-word manifesto: Reckoning, meaning “the avenging or punishing of past mistakes or misdeeds.” The organizers believe no word better captures the state of mind of those in Big Tech today, given the incalculable mistakes these companies have made while building up the internet. That is the focus of this year’s Code Con 2019.

In one of the biggest takeaways from her interview, Wojcicki said she is okay with taking content down, but she doesn’t think it’s a good idea to review it before it goes up on the massive video-sharing platform. “I think we would lose a lot of voices,” Wojcicki said. “I don’t think that’s the right answer.”

Kafka asked her about the updated hate speech policy YouTube announced last week. Wojcicki responded that it was a very important decision for YouTube and that the company had been working on it for months.
The company updated its hate speech policy, under which it will take down “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.” The policy explicitly covers removing videos that promote neo-Nazi content or that deny commonly accepted violent events, like the Holocaust or the Sandy Hook school shooting.

The conversation also turned to last week’s incident, when YouTube decided that Steven Crowder wasn’t violating its rules when he kept posting videos with homophobic slurs directed at Vox journalist Carlos Maza; the company eventually demonetized Crowder’s channel. Kafka pointed out that the company makes such decisions only after content is already on YouTube’s platform. In response, Wojcicki emphasized the importance of reviewing content after it is published on the site. “We see all these benefits of openness, but we also see that that needs to be married with responsibility,” she said. She also added that the decision that Crowder’s videos did not violate the policy was hurtful to the LGBTQ community, that this was not the company’s intention, and that it is really sorry about it. However, she did not commit to any further action in this case.

https://twitter.com/voxdotcom/status/1138236928682336256

Wojcicki admitted that there will likely always be content on YouTube that violates its policies. “At the scale that we’re at, there are always gonna be people who want to write stories,” she said, suggesting that journalists will always choose to focus on the negative aspects of YouTube in their reporting. “We have lots of content that’s uploaded and lots of users and lots of really good content. When we look at it, what all the news and the concerns and stories have been about is this fractional 1 percent,” Wojcicki said.
“If you talk about what the other 99 points whatever that number is that’s all really valuable content.” “Yes, while there may be something that slips through or some issue, we’re really working hard to address this,” she said.

https://twitter.com/Recode/status/1138229101834133510

Instead of approving videos ahead of time, Wojcicki suggested tiers in which creators earn certain privileges over time, meaning more distribution and monetization of their content. “I think this idea of like not everything is automatically given to you on day one, that it’s more of a — we have trusted tiers,” she said.

Wojcicki then discussed YouTube limiting recommendations, comments, and sharing, which has reduced views of white supremacist videos by 80 percent since 2017; only now has the platform banned that content altogether. YouTube is one of several prominent tech companies trying to figure out how to deal with hateful content proliferating on their platforms.

Later, Wojcicki shifted the conversation to the improvements YouTube has made in the past few years. “Two years ago there were a lot of articles, a lot of concerns about how we handle violent extremism. If you talk to people who are experts in this field, you can see that we’ve made tremendous progress. We have a lot of tools, we work hard to understand what is happening on it and really work hard to enforce the work that we’re doing. I think if you look across the work you can see we’ve made tremendous progress in a number of these areas,” Wojcicki said.
“If you were to fast-forward a couple years and say, well, what that would look like in 12 months and then in another 12 months, what are all the different tools that have been built, I think you’ll see there will be a lot of progress.”

There were questions from the audience as well. One attendee asked Wojcicki, “You started off with an apology to the LGBT community, but then you also said that you were involved in it and you think YouTube made the right call. People don’t feel like that’s an apology, and they are concerned that YouTube flags LGBT-positive content as sensitive just for being LGBT, and yet slurs are allowed. So I am curious to know: are you really sorry for anything that happened to the LGBTQ community, or are you just sorry they were offended?”

Wojcicki said she is personally really sorry for what happened and, speaking for the company, that it had not intended to do that and is also sorry for the hurt it caused. She then continued with a vague response and did not really answer the question of why LGBT-positive content gets flagged as sensitive while slurs against the LGBT community are allowed on YouTube. She reiterated the policy and tried to soften the issue by saying the company has people who belong to that community and supports them, and that it was a “hard” decision:

“So I am really personally really sorry and that was not our intent. YouTube has always been the home of so many LGBTQ creators and that was why it was so emotional and that’s why I think this really even though it was a hard decision it was made harder for us and YouTube has so many people from the LGBTQ and we have always wanted to support the community openly in spite of this hard issue that we have had right now and people had this criticism on why you still and why you changed your logo to rainbows even though you made this hard decision and because as a company we really wanna support the community its just that from a policy standpoint we need to be consistent because if we took down that content there would be so much other content that we would need to take down and we don't want just to be knee jerk and we need to think about it in a very thoughtful way. We will speak to people from LGBTQ community and make sure that we are incorporating that going forward in terms of how think about harassment and make sure that we are implementing that in a fair and consistent way going forward. And I think it was a hard week and I am truly sorry for the hurt that we have caused to the community. It was not our intention at all and I do want to say that many changes we made to the hate policy are really going to be beneficial to the community. There are a lot of videos there and there are a lot of ways the community is attacked and we will be taking down those videos going forward and we will be very consistent and if we see such videos we will take them down.”

https://twitter.com/Recode/status/1138231255445598208

The community is also unhappy with her response, saying this does not answer the question and that she tried to deflect responsibility.

https://twitter.com/VickerySec/status/1138657458182770688
https://twitter.com/Aleen/status/1138246902539886592

Facebook argues it didn’t violate users’ privacy rights and thinks there’s no expectation of privacy because there is no privacy on social media
Time for data privacy: DuckDuckGo CEO Gabe Weinberg in an interview with Kara Swisher
Privacy Experts discuss GDPR, its impact, and its future on Beth Kindig’s Tech Lightning Rounds Podcast

NgRx 8 released with NgRx Data, creator functions, mock selectors for isolated unit testing, and more!

Bhagyashree R
12 Jun 2019
2 min read
On Monday, the team behind NgRx, a platform that provides reactive libraries for Angular, announced the release of NgRx 8. This release includes the NgRx Data package, creator functions, four runtime checks, mock selectors, and much more. Following are some of the updates in NgRx 8:

NgRx Data integrated into the NgRx platform

In this release, the team has integrated the angular-ngrx-data library by John Papa and Ward Bell directly into the NgRx platform as a first-party package. Using NgRx properly in your Angular applications requires a deeper understanding and a lot of boilerplate code. This package gives you a “gentle introduction” to NgRx without the boilerplate and simplifies entity data management.

Redesigned creator functions

NgRx 8 comes with two new creator functions:

createAction: Previously, creating an action meant defining an action type, creating a class, and finally creating an action union. The new createAction function allows you to create actions in a less verbose way.
createReducer: With this function, you can create a reducer without a switch statement. It takes the initial state as its first parameter, followed by any number of ‘on’ functions.

Four new runtime checks

To help developers better follow the NgRx core concepts and best practices, this release introduces four runtime checks, added to “shorten the feedback loop of easy-to-make mistakes when you’re starting to use NgRx, or even a well-seasoned developer might make”:

The strictStateImmutability check verifies that a developer is not modifying the state object.
The strictActionImmutability check verifies that actions are not modified.
The strictStateSerializability check verifies that the state is serializable.
The strictActionSerializability check verifies that the action is serializable.

All of these checks are opt-in and are disabled automatically in production builds.

Mock selectors for isolated unit testing

NgRx 7 introduced MockStore, a simpler way to condition NgRx state in unit tests, but it did not allow isolated unit testing on its own. NgRx 8 combines mock selectors with MockStore to make this possible. You can use mock selectors by importing @ngrx/store/testing.

To know more in detail, check out the official announcement on Medium.

ng-conf 2018 highlights, the popular angular conference
Angular 8.0 releases with major updates to framework, Angular Material, and the CLI
5 useful Visual Studio Code extensions for Angular developers
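To see why createReducer removes the switch-statement boilerplate described above, here is a dependency-free TypeScript sketch that mimics the shape of the @ngrx/store createReducer/on API. This is an illustrative mock, not the real NgRx implementation — in an actual app you would import createAction, createReducer, and on from @ngrx/store:

```typescript
// Illustrative mock of NgRx 8's creator-function style (not the real library).
interface Action { type: string; [key: string]: unknown; }
type OnHandler<S> = { type: string; reduce: (state: S, action: Action) => S };

// `on` pairs an action type with its state transition.
function on<S>(type: string, reduce: (state: S, action: Action) => S): OnHandler<S> {
  return { type, reduce };
}

// `createReducer` assembles the handlers into one reducer -- no switch statement.
function createReducer<S>(initialState: S, ...handlers: OnHandler<S>[]) {
  return (state: S, action: Action): S => {
    const h = handlers.find(x => x.type === action.type);
    return h ? h.reduce(state, action) : state;
  };
}

// A plain action and a parameterized one (the real library uses props<>() for this).
const increment: Action = { type: '[Counter] Increment' };
const addBy = (amount: number): Action => ({ type: '[Counter] Add By', amount });

const counterReducer = createReducer<number>(
  0,
  on('[Counter] Increment', s => s + 1),
  on('[Counter] Add By', (s, a) => s + (a.amount as number)),
);

console.log(counterReducer(0, increment));  // 1
console.log(counterReducer(1, addBy(5)));   // 6
```

The real createReducer additionally type-checks each `on` handler against the action creator it is registered for, which a switch statement cannot do.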


Scala 2.13 is here with overhauled collections, improved compiler performance, and more!

Bhagyashree R
12 Jun 2019
2 min read
Last week, the Scala team announced the release of Scala 2.13. This release brings a number of improvements, including overhauled standard library collections, a 5-10% faster compiler, and more.

Overhauled standard library collections

The major highlight of Scala 2.13 is the standard library collections, which are now simpler, faster, and safer than in previous versions. Some of the important changes made to collections include:

Simpler method signatures: The implicit CanBuildFrom parameter was one of the most powerful abstractions in the collections library, but it made method signatures very difficult to understand. Beginning with this release, transformation methods no longer take an implicit CanBuildFrom parameter, making the resulting code simpler and easier to understand.

Simpler type hierarchy: The package scala.collection.parallel is now a separate Scala module, shipped as a separate JAR that you can omit from your project if it does not use parallel collections. Additionally, Traversable and TraversableOnce are now deprecated.

New concrete collections: The Stream collection is replaced by LazyList, which evaluates elements in order and only when needed. A new mutable.CollisionProofHashMap collection implements mutable maps using a hashtable with red-black trees in the buckets, providing good performance even in worst-case scenarios of hash collisions. The mutable.ArrayDeque collection is added: a double-ended queue that internally uses a resizable circular buffer.

Improved concurrency

In Scala 2.13, Futures were “internally redesigned” to ensure they behave as expected under a broader set of failures. The updated Futures also provide a foundation for increased performance and more robust applications.

Changes in the language

The updates to the language include literal-based singleton types, partial unification enabled by default, and by-name method arguments extended to support both implicit and explicit parameters.

Compiler updates

The compiler can now perform deterministic and reproducible compilation, meaning it generates identical output for identical input in more cases. Also, operations on collections and arrays are now optimized, making the compiler 5-10% faster compared to Scala 2.12.

These were some of the exciting updates in Scala 2.13. For a detailed list, check out the official release notes.

How to set up the Scala Plugin in IntelliJ IDE [Tutorial]
Understanding functional reactive programming in Scala [Tutorial]
Classifying flowers in Iris Dataset using Scala [Tutorial]
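The LazyList replacement for Stream is easy to see in action: because elements are evaluated only on demand, an infinite definition is fine as long as you force only finitely many elements. A small sketch in standard Scala 2.13, no extra dependencies:

```scala
// LazyList (Scala 2.13's replacement for Stream) evaluates elements lazily,
// so this infinite Fibonacci sequence is safe until elements are forced.
val fibs: LazyList[BigInt] =
  BigInt(0) #:: BigInt(1) #:: fibs.zip(fibs.tail).map { case (a, b) => a + b }

println(fibs.take(8).toList)  // List(0, 1, 1, 2, 3, 5, 8, 13)
```

Unlike the old Stream, LazyList is also lazy in its head, not just its tail, which makes this style of self-referential definition more uniform.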