Wednesday, March 01, 2017

How to set storage account connection string in Azure Functions

Hi,

I was developing an Azure Function App that connects to Azure Blob storage. After setting up the binding for my blob storage account, I got the following error message when running my Function App:

[Screenshot: error message listing three options for configuring the storage connection string]
The error message in the screenshot above suggests three options to fix this. I will walk through the first one: setting the connection string in the appsettings.json file so it looks like this:

appsettings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "AzureWebJobsDashboard": "",
    "AzureWebJobsmofunctions_STORAGE": "DefaultEndpointsProtocol=https;AccountName=mofunctions;AccountKey=KEY"
  }
}

function.json (just the section where I define my blob binding info)

{
  "type": "blob",
  "name": "iBlob",
  "path": "mydata/file1.csv",
  "connection": "mofunctions_STORAGE",
  "direction": "in"
}

NOTE: 
You will notice that the connection value in function.json is the suffix of the AzureWebJobs-prefixed key in appsettings.json: the runtime prepends "AzureWebJobs" to the connection name when looking up the setting.
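This lookup convention can be sketched in a few lines (a minimal illustration, not the actual Functions runtime code; the dictionary stands in for the "Values" section of appsettings.json):

```python
def resolve_connection_string(connection_name, app_settings):
    """Mimic how the runtime resolves the 'connection' value from
    function.json: prepend "AzureWebJobs" to the name and look that
    key up in the app settings (illustrative only)."""
    return app_settings["AzureWebJobs" + connection_name]

# Stand-in for the "Values" section of appsettings.json.
settings = {
    "AzureWebJobsmofunctions_STORAGE":
        "DefaultEndpointsProtocol=https;AccountName=mofunctions;AccountKey=KEY",
}

conn = resolve_connection_string("mofunctions_STORAGE", settings)
```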


Once this is set, press F5 and you will be able to connect to the storage account and read the blob contents.



Enjoy!

Thursday, January 26, 2017

Mashing RDDs in Apache Spark from RDBMs perspective

Hi,

Happy new year! This is my first post of 2017. 2016 was an amazing year for me: lots of work, projects, and achievements. Looking forward to 2017.

I am writing this blog post to cover the standard techniques for joining data across Resilient Distributed Datasets (RDDs) in Apache Spark. I would like to share some insights on working with multiple RDDs the way we work with tables in relational database management systems (RDBMSs).

Apache Spark supports joins on pair RDDs, where you can implement all the kinds of joins we know from RDBMSs. Below I list how each one maps to Spark.

Apache Spark Join Transformations Operations:

1) join: This is the equivalent of an inner join in RDBMSs. It returns a new pair RDD whose elements contain all possible pairs of values from the first and second RDDs that have the same key. For keys that exist in only one of the two RDDs, the resulting RDD has no elements.

2) leftOuterJoin: This is the equivalent of a left outer join in RDBMSs. The resulting RDD also contains elements for keys that don't exist in the second RDD.

3) rightOuterJoin: This is the equivalent of a right outer join in RDBMSs. The resulting RDD also contains elements for keys that don't exist in the first RDD.

4) fullOuterJoin: This is the equivalent of a full outer join in RDBMSs. The resulting RDD contains elements for every key that exists in either of the two RDDs.

If the RDDs contain duplicate keys, those keys are joined multiple times: the result contains one element per matching pair of values.
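The semantics above can be sketched in plain Python (illustrative only, not Spark code: lists of (key, value) tuples stand in for pair RDDs, and the hypothetical employees/budgets data is mine):

```python
def _index(pairs):
    """Group values by key, keeping duplicates."""
    idx = {}
    for k, v in pairs:
        idx.setdefault(k, []).append(v)
    return idx

def inner_join(a, b):
    """Like RDD.join: every (v, w) pair for keys present in both sides."""
    bi = _index(b)
    return [(k, (v, w)) for k, v in a for w in bi.get(k, [])]

def left_outer_join(a, b):
    """Like leftOuterJoin: keep all keys from the first side;
    missing matches from the second side appear as None."""
    bi = _index(b)
    return [(k, (v, w)) for k, v in a for w in bi.get(k, [None])]

def full_outer_join(a, b):
    """Like fullOuterJoin: keep keys from either side, None for gaps."""
    ai, bi = _index(a), _index(b)
    keys = list(dict.fromkeys(list(ai) + list(bi)))
    return [(k, (v, w))
            for k in keys
            for v in ai.get(k, [None])
            for w in bi.get(k, [None])]

employees = [("sales", "Ann"), ("it", "Bob"), ("hr", "Cal")]
budgets = [("sales", 100), ("sales", 200), ("it", 300)]

# The duplicate key "sales" is joined once per matching pair of values.
print(inner_join(employees, budgets))
print(left_outer_join(employees, budgets))
```

Note that rightOuterJoin is simply the mirror image of leftOuterJoin, with the None fill on the first side instead of the second.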


Hope this helps!




Friday, October 14, 2016

Introducing Power BI Embedded talk at Cloud Summit


Hi All,

Earlier today, I presented an "Introducing Power BI Embedded" talk covering platform capabilities and tools at the Cloud Summit event at the Microsoft Chevy Chase office.

The session covered Power BI Platform capabilities, tools and Power BI Embedded as PaaS option in Microsoft Cloud platform.

I got a lot of questions about Power BI dataset scheduling and about working with data capabilities, including DirectQuery vs. import options while authoring reports in Power BI Desktop. I also covered the need for the Power BI Gateway in hybrid scenarios.





Enjoy.

Thursday, October 13, 2016

Fixing powerbi.d.ts missing modules errors in Visual Studio 2015

Hi,

While working in Visual Studio 2015, I got errors due to missing modules in the powerbi.d.ts file. I have an ASP.NET MVC project that uses Power BI Embedded, and I wanted to get the application up and running, but these errors kept appearing while building the app.

[Screenshot: Visual Studio build errors reported for powerbi.d.ts]
These errors are due to missing TypeScript tools for Visual Studio 2015. Once you install them, you will be able to run your app and all these errors disappear.


To fix this problem, follow these steps:


  • Open Tools | Extensions and Updates.
  • Select Online in the tree on the left.
  • Search for TypeScript using the search box in the upper right.
  • Select the most current available TypeScript version.
  • Download and install the package.
  • Build your project!

Hope this helps.

Monday, September 19, 2016

Extending Product Outreach with Outlook Connectors

Hi All,

Last Saturday at SharePoint Detroit I presented a talk titled "Extending Product Outreach with Outlook Connectors", in which I covered, with demos, how to utilize Office 365 Groups to extend product outreach using Outlook group connectors.

Session Description:

Office 365 Connectors are a brand new experience that delivers relevant interactive content and updates from popular apps and services to Office 365 Groups. Whether you are tracking a Twitter feed, managing a project with Trello, or watching the latest news headlines with Bing, Office 365 Connectors surface all the information you care about in the Office 365 Groups shared inbox, so you can easily collaborate with others and interact with the updates as they happen. The session will cover how to build your own Office 365 Connectors and how to work with Microsoft to build one for your company.


Thursday, September 08, 2016

Build Intelligent Microservices Solutions using Azure

Hi All,

I had the pleasure last night to present at one of our local user groups to talk about building intelligent microservices in Azure.

The session covered in detail how to build intelligent microservices solutions using Cloud Services (web and worker roles), Azure App Service features, and Service Fabric. The session was demo driven: I demonstrated how to design and provision complete end-to-end solutions using Cloud Services with web roles, worker roles, and Service Bus in Azure.
I also covered Azure App Service capabilities that help developers scale and monitor production applications, in addition to setting up continuous deployment.

Session objectives and takeaways:

  1. Benefits of creating micro services in the cloud
  2. End-To-End Use case for building cloud service with web & worker roles with service bus integration
  3. Azure App Service intelligent features including troubleshooting, CI, back up, routing, scheduling & other features
  4. Azure Service Fabric microservices platform


The presentation is posted below.

Wednesday, August 31, 2016

Building Big Data Solutions in Azure Data Platform @ Data Science MD

Hi All,

Yesterday I was at Johns Hopkins University in Laurel, MD presenting how to build big data solutions in Azure. The presentation focused on the underlying technologies and tools needed to build end-to-end big data solutions in the cloud. I presented the capabilities Azure offers out of the box, in addition to the cluster types and tiers available for ISVs and developers.

The session covers the following:

1) What an HDInsight cluster offers in the Hadoop ecosystem technology stack.
2) HDInsight cluster tiers and types.
3) HDInsight developer tools in Visual Studio 2015.
4) Working with HBase databases and the Hive View, and deploying Hive apps from Visual Studio.
5) Building, debugging, and deploying Storm apps into Storm clusters.
6) Working with Spark clusters using Jupyter and PySpark.


Session Title: Building Big Data Solutions in Azure Data Platform

Session Details:
The session covers how to get started building big data solutions in Azure. Azure provides different cluster types for the Hadoop ecosystem. The session covers the basics of HDInsight clusters, including Apache Hadoop HDFS, HBase, Storm, and Spark, and shows how to integrate with HDInsight from .NET using different Hadoop integration frameworks and libraries. It is a jump start for engineers and DBAs with RDBMS experience who want to start developing Hadoop solutions. The session is demo driven and covers the basics of the Hadoop open-source products.