Announcing General Availability of Power BI Real-Time Streaming Datasets

Today, I am happy to announce the general availability of real-time streaming datasets in Power BI. This feature set allows users to easily stream data into Power BI via the REST APIs, Azure Stream Analytics, and PubNub, and to see that data instantly light up on their dashboards.
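As a hedged illustration (not from the announcement), pushing a row into a streaming dataset via the REST API amounts to a single POST; here in Julia with the HTTP.jl and JSON.jl packages, with the push URL, key, and field names as placeholders for the values Power BI generates for your dataset:

```julia
using HTTP, JSON

# Placeholder push URL; Power BI generates the real one (with an
# embedded key) on the streaming dataset's "API info" page.
push_url = "https://api.powerbi.com/beta/<workspace>/datasets/<dataset-id>/rows?key=<key>"

# One row; the API accepts a JSON array of row objects whose fields
# match the dataset's schema (the field names here are hypothetical).
row = Dict("timestamp" => "2017-02-01T12:00:00Z", "value" => 98.6)

HTTP.post(push_url, ["Content-Type" => "application/json"], JSON.json([row]))
```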

from Microsoft Power BI Blog https://powerbi.microsoft.com/en-us/blog/announcing-general-availability-of-power-bi-real-time-streaming-datasets/

Data Science Virtual Machine updated, now includes RStudio, JuliaPro

The Windows edition of the Data Science Virtual Machine (DSVM) was recently updated on the Azure Marketplace. This update upgrades some existing components and adds some new ones as well.

You now have your choice of integrated development environment to use with R. RStudio Desktop is now included in the Data Science Virtual Machine image, so there is no need to install it manually. And R Tools for Visual Studio has been upgraded to version 0.5.
Microsoft R Server has also been upgraded to version 9.0. This includes the latest R 3.3.2 language engine, the RevoScaleR package for big-data support in R, and also the new MicrosoftML package with several new, high-performance machine learning techniques. If you choose a GPU-enabled NC-class instance for your DSVM, the MicrosoftML package can make use of the GPUs for even more performance.
The DSVM also supports the Julia language with the inclusion of the JuliaPro distribution. This includes the Julia compiler and popular packages, a debugger, and the Juno IDE. If you’re new to Julia, the blog post Julia – A Fresh Approach to Numerical Computing provides an introduction. 
There are also improved deep learning capabilities, with the latest version of the Microsoft Cognitive Toolkit (formerly called CNTK). And if you use a GPU-enabled instance, the Deep Learning Toolkit extension provides GPU-enabled builds of the Cognitive Toolkit, MXNet, and TensorFlow.
If you want to try out the Data Science Virtual Machine, the blog post linked below provides links to the documentation and several tutorials to get you started, along with information about the Linux edition of the DSVM.
Cortana Intelligence and Machine Learning Blog: New Year & New Updates to the Windows Data Science Virtual Machine

from Revolutions http://blog.revolutionanalytics.com/2017/01/dsvm-updated-now-includes-rstudio.html

Julia – A Fresh Approach to Numerical Computing

This post is authored by Viral B. Shah, co-creator of the Julia language and co-founder and CEO at Julia Computing, and Avik Sengupta, head of engineering at Julia Computing.

The Julia language provides a fresh approach to numerical computing, one in which there is no longer a compromise between performance and productivity. A high-level language that makes writing natural mathematical code easy, with runtime speeds approaching raw C, Julia has been used to model economic systems at the Federal Reserve, drive autonomous cars at the University of California, Berkeley, optimize the power grid, calculate solvency requirements for large insurance firms, model the US mortgage markets, and map all the stars in the sky.

It should be no surprise, then, that Julia is a natural fit for many areas of machine learning. ML, and deep learning in particular, drives some of the most demanding numerical computing applications in use today. And the power of Julia makes it a perfect language for implementing these algorithms.

One of the key promises of Julia is to eliminate the so-called “two-language problem”: the phenomenon of writing prototypes in a high-level language for productivity, but having to drop down into C for performance-critical sections when working on real-life data in production. This is not necessary in Julia, because there is no performance penalty for using high-level or abstract constructs.
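A minimal sketch makes this concrete: the generic function below is plain high-level Julia, yet the compiler emits a specialized, C-like native method for each element type it is called with.

```julia
# A generic sum written entirely in high-level Julia; no drop into a
# lower-level language is needed for speed.
function mysum(xs)
    s = zero(eltype(xs))
    for x in xs
        s += x
    end
    return s
end

mysum(rand(10^6))   # compiles a specialization for Vector{Float64}
mysum(1:10^6)       # a second specialization, for an integer range
```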

This means both the researcher and the engineer can now use the same language. One can, for example, write custom kernels in Julia that perform as well as kernels written in C. Further, language features such as macros and reflection can be used to create high-level APIs and DSLs that increase the productivity of both the researcher and the engineer.
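As a toy example of that metaprogramming (not from the original post), the macro below rewrites an expression at compile time so that it is evaluated and its wall-clock time reported; DSLs in Julia libraries are built from the same machinery.

```julia
# @timeit is a hypothetical, illustrative macro: it splices the user's
# expression into a timed block. Macro hygiene keeps t0 and val from
# clashing with caller variables; esc() evaluates ex in caller scope.
macro timeit(ex)
    quote
        t0 = time()
        val = $(esc(ex))
        println("elapsed: ", time() - t0, " seconds")
        val
    end
end

@timeit sum(rand(10^7))   # prints the elapsed time, returns the sum
```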

GPU

Modern ML depends heavily on general-purpose GPU computing to attain acceptable performance. As a flexible, modern, high-level language, Julia is well placed to take full advantage of modern hardware.

First, Julia’s exceptional FFI capabilities make it trivial to use the GPU drivers and CUDA libraries to offload computation to the GPU without any additional overhead. This allows Julia deep learning libraries to use GPU computation with very little effort.
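To give a flavor of that FFI (using a C math routine here rather than an actual CUDA entry point), ccall invokes a C function directly, with no wrapper layer; bindings to the GPU drivers and CUDA libraries are built on the same primitive.

```julia
# Call the C library's cos directly; no glue code is generated or needed.
x = ccall(:cos, Float64, (Float64,), 1.0)
println(x)   # 0.5403023058681398
```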

Beyond that, libraries such as ArrayFire allow developers to use natural-looking mathematical operations, while performing those operations on the GPU instead of the CPU. This is probably the easiest way to utilize the power of the GPU from code. Julia’s type and function abstractions make this possible with, once again, very little performance overhead.
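A brief sketch with the ArrayFire.jl package (assuming it is installed with a working GPU backend) shows the idea: ordinary array syntax, executed on the device.

```julia
using ArrayFire

a = rand(AFArray{Float32}, 1000, 1000)   # array allocated on the GPU
b = a * a'                               # matrix multiply runs on the GPU
c = Array(b)                             # copy the result back to the CPU
```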

Julia has a layered code generation and compilation infrastructure that leverages LLVM. (Incidentally, it also provides some amazing introspection facilities into this process.) Based on this, Julia has recently developed the ability to directly compile code onto the GPU. This is an unparalleled feature among high-level programming languages.
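Those introspection facilities are easy to try; the standard macros below display each stage of the pipeline for a one-line function.

```julia
using InteractiveUtils   # exports the @code_* macros in recent Julia

f(x) = 2x + 1

@code_lowered f(1)   # lowered Julia AST
@code_typed   f(1)   # type-inferred intermediate representation
@code_llvm    f(1)   # LLVM IR, the layer the GPU backend builds on
@code_native  f(1)   # machine code for the host CPU
```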

While an x86 CPU paired with a GPU is currently the most popular hardware setup for deep learning applications, other hardware platforms have very interesting performance characteristics. Among them, Julia now fully supports the IBM Power platform, as well as the Intel Knights Landing (KNL) architecture.

Libraries

The Julia ecosystem has, over the last few years, matured sufficiently to realize these benefits in many domains of numerical computing. Thus, a rich set of libraries for ML is available in Julia right now. Deep learning frameworks with natural bindings to Julia include MXNet and TensorFlow. Those wanting to dive into the internals can use the pure Julia libraries Mocha and Knet. In addition, there are libraries for random forests, SVMs, and Bayesian learning.
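As one example, here is a small multilayer perceptron adapted from MXNet.jl's documented example; treat the exact layer names and keywords as illustrative of the binding's style rather than a guaranteed API.

```julia
using MXNet

# Declare the network symbolically; @mx.chain threads each symbol
# into the next layer.
mlp = @mx.chain mx.Variable(:data)                 =>
      mx.FullyConnected(name=:fc1, num_hidden=128) =>
      mx.Activation(name=:relu1, act_type=:relu)   =>
      mx.FullyConnected(name=:fc2, num_hidden=10)  =>
      mx.SoftmaxOutput(name=:softmax)
```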

Using Julia with all these libraries is now easier than ever. Thanks to the Data Science Virtual Machine (DSVM), running Julia on Azure is just a click away. The DSVM includes a full distribution of JuliaPro, the professional Julia development environment from Julia Computing Inc, along with many popular statistical and ML packages. It also includes the IJulia system, which brings Jupyter notebooks to the Julia language. Together, these create the perfect environment for data science, for both exploration and production.
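On the DSVM those pieces are preinstalled, so starting a Julia notebook is two lines (elsewhere, run Pkg.add("IJulia") first):

```julia
using IJulia
notebook()   # launches Jupyter in the browser with a Julia kernel
```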

Viral Shah
@Viral_B_Shah

from Cortana Intelligence and Machine Learning Blog https://blogs.technet.microsoft.com/machinelearning/2017/01/31/julia-a-fresh-approach-to-numerical-computing/

Slack finally launches its enterprise edition

After a long wait, Slack has announced a version of its popular work chat application designed for enterprises. On Tuesday in San Francisco, the company unveiled its new Enterprise Grid product, aimed at helping companies administer and connect multiple chat instances.

Grid allows administrators to set up each team inside an organization with its own centrally managed Slack instance. Those workspaces can then be linked together using shared channels, and all of the people inside an enterprise can direct-message one another, even if they’re not part of the same workspace.

from Computerworld Cloud Computing http://www.computerworld.com/article/3163495/enterprise-applications/slack-finally-launches-its-enterprise-edition.html#tk.rss_cloudcomputing