Tuesday, August 23, 2016

How to create websites with MySQL database in Azure


Microsoft recently announced Azure App Service support for the In-App MySQL feature (still in preview).

What does "In-App MySQL" in App Service mean?

It means that a MySQL database is provisioned for you and shares resources with your web app. MySQL in-app enables developers to run the MySQL server side-by-side with their web application within the same environment, which makes it easier to develop and test PHP applications that use MySQL.

So you can host your MySQL In-App database alongside your website in Azure App Service, with both sharing the same resources. There is no need to provision a separate VM for MySQL or purchase ClearDB for websites under development. The feature is available for new and existing web apps in Azure.

We definitely recommend moving off the In-App MySQL database when you go to production, since the feature is intended for development and testing purposes only.

In-App MySQL is like hosting a SQL Server Express database instance in your app before migrating it to a full SQL Server instance.

How to provision MySQL In-App to Azure App Service?

Create a new web app or select an existing one, and you will find the "MySQL In App (Preview)" option. Toggle MySQL In App to On, then save.
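Once the feature is enabled, the app reads its database connection details from an environment variable (named MYSQLCONNSTR_localdb at the time of the preview; the variable name, key names, and value shape below are taken from the announcement and may change while the feature is in preview). A rough Python sketch of reading and parsing it:

```python
import os

def parse_mysql_connstr(connstr):
    """Parse a 'Key=Value;Key=Value' style connection string into a dict."""
    parts = (p for p in connstr.split(";") if p)
    return {k.strip(): v.strip() for k, v in (p.split("=", 1) for p in parts)}

# Illustrative value in the shape the preview uses (host:port, user, password):
sample = "Database=localdb;Data Source=127.0.0.1:49667;User Id=azure;Password=secret"

# Fall back to the sample when running outside App Service.
settings = parse_mysql_connstr(os.environ.get("MYSQLCONNSTR_localdb", sample))
host, port = settings["Data Source"].split(":")
print(settings["Database"], host, port)
```

Your data-access code can then hand these values to whatever MySQL client library your app uses.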

Current limitations of the MySQL In App feature:
1) The auto-scale feature is not supported.
2) Enabling local cache is not supported.
3) You can access your database only using the phpMyAdmin web tool or the Kudu debug console.
4) Web Apps and WordPress templates support MySQL In App when provisioned in the Azure portal. The team is working to expand this to other services in the portal.

Hope this helps.

1) MySQL in-app for web apps: https://blogs.msdn.microsoft.com/appserviceteam/2016/08/18/announcing-mysql-in-app-preview-for-web-apps

Monday, August 22, 2016

Avro vs Parquet vs ORCFile as Hadoop storage files

When building big data applications and systems on Hadoop, we have to think about the best way to store our data every time we load it into the cluster. Storing petabytes of data brings plenty of challenges, including how much storage is required and how to make reads faster.

In Hadoop, you can store your files in many formats. I would like to share some of these options and the scenarios in which to use each.

How to store data files in Hadoop and what are the available options:

1) Avro
Apache Avro™ is a data serialization system. Avro provides row-based data storage; the schema is encoded in the file, and it provides binary data serialization.

Use Case: Use Avro if you need binary data serialization for your data while maintaining a self-contained schema in row-based data files.
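To make "row-based with an embedded schema" concrete, here is a minimal hand-rolled sketch in plain Python. This is an illustration of the idea only, not Avro's actual wire format: the schema travels inside the file, and records are laid out one whole row at a time.

```python
import io
import json
import struct

schema = {"name": "user", "fields": ["id", "name"]}
rows = [(1, "alice"), (2, "bob")]

buf = io.BytesIO()
header = json.dumps(schema).encode()
buf.write(struct.pack(">I", len(header)))  # schema length prefix
buf.write(header)                          # schema embedded in the file itself
for uid, name in rows:                     # row-based: each record stored whole
    rec = json.dumps({"id": uid, "name": name}).encode()
    buf.write(struct.pack(">I", len(rec)))
    buf.write(rec)

# A reader needs nothing but the file: the schema is self-contained.
buf.seek(0)
(hlen,) = struct.unpack(">I", buf.read(4))
print(json.loads(buf.read(hlen)))  # {'name': 'user', 'fields': ['id', 'name']}
```

The real Avro format adds binary encodings, sync markers, and compression on top of this basic layout.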

Read more about Avro: http://avro.apache.org/

2) Parquet
Apache Parquet is a columnar storage format available to any project in the Hadoop ecosystem, regardless of the choice of data processing framework, data model, or programming language.

Use Case: Use Parquet when you want to store data in column-based files and save on storage. Parquet uses efficient encoding and compression schemes for the data on your Hadoop clusters, and it works with different processing frameworks and programming languages.
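As a rough illustration of why a columnar layout saves storage and I/O (plain Python, not Parquet's actual encoding): grouping a column's values together means a query can read just the columns it needs, and a repetitive column compresses far better on its own than when interleaved with other columns.

```python
import zlib

rows = [(i, f"user{i}", "US") for i in range(1000)]

# Column-oriented layout: each column stored as its own contiguous array.
columns = {
    "id":      [r[0] for r in rows],
    "name":    [r[1] for r in rows],
    "country": [r[2] for r in rows],
}

# 1) A query touching only 'country' reads a single array, skipping the rest.
distinct_countries = set(columns["country"])

# 2) Per-column compression: a repetitive column shrinks dramatically,
#    which is much harder when its values are mixed into full rows.
raw = ",".join(columns["country"]).encode()
packed = zlib.compress(raw)
print(len(distinct_countries), len(raw), len(packed))
```

Parquet layers dictionary encoding, run-length encoding, and page-level compression on top of this column-at-a-time layout.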

Read more about Apache Parquet: https://parquet.apache.org/

3) ORCFile
Apache ORC is the smallest, fastest columnar storage for Hadoop workloads. ORC is a self-describing, type-aware columnar file format designed for Hadoop workloads. It is optimized for large streaming reads, with integrated support for finding required rows quickly. Storing data in a columnar format lets the reader read, decompress, and process only the values that are required for the current query. Because ORC files are type-aware, the writer chooses the most appropriate encoding for the type and builds an internal index as the file is written.

Use Case: Use ORC when you need columnar storage in Hadoop with efficient storage and fast data retrieval. An ORC file contains its own schema, which makes reading values very fast.
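The "internal index" mentioned above can be sketched like this (plain Python, not the real ORC format): ORC keeps lightweight min/max statistics per stripe, so a reader can skip whole stripes that cannot possibly contain the rows a query needs.

```python
# Split a column into stripes and record min/max per stripe, mimicking
# the lightweight index ORC writes alongside the data.
values = list(range(100))
stripe_size = 25
stripes = [values[i:i + stripe_size] for i in range(0, len(values), stripe_size)]
index = [(min(s), max(s)) for s in stripes]

def find(target):
    """Scan only the stripes whose [min, max] range can contain target."""
    scanned = 0
    for (lo, hi), stripe in zip(index, stripes):
        if lo <= target <= hi:
            scanned += 1
            if target in stripe:
                return target, scanned
    return None, scanned

print(find(60))  # found after scanning 1 of the 4 stripes
```

The same idea is why ORC works well for "large streaming reads with quick row lookup": most stripes are ruled out from the statistics alone.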

Read more about Apache ORC: https://orc.apache.org/

Hope this helps!

Wednesday, August 17, 2016

How to read images from a URL in asp.net core


I was building an ASP.NET Core Web API that is supposed to read images from an external URL. Even though I have done this dozens of times, I got stuck for a bit trying to get the same code that reads an image from a URL working in my ASP.NET Core project in Visual Studio 2015.

After a little bit of searching, I found out that before trying to read a static file such as an image from your controller, you first need to enable directory browsing and configure the routing path, so that you can view the image in a browser by hitting its URL.

So, follow the steps below to be able to read images from a URL (in my case the images were part of the project):

1) Move the images folder (or any static files folder) under the wwwroot folder.
2) Open the Startup.cs file and enable directory browsing.

C# code to enable directory browsing and serving static files:

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    // Requires: using System.IO; using Microsoft.Extensions.FileProviders;

    // Serve static files from wwwroot/images at the /images request path
    app.UseStaticFiles(new StaticFileOptions()
    {
        FileProvider = new PhysicalFileProvider(
            Path.Combine(Directory.GetCurrentDirectory(), @"wwwroot\images")),
        RequestPath = new PathString("/images")
    });

    // Enable directory browsing (also call services.AddDirectoryBrowser() in ConfigureServices)
    app.UseDirectoryBrowser(new DirectoryBrowserOptions()
    {
        FileProvider = new PhysicalFileProvider(
            Path.Combine(Directory.GetCurrentDirectory(), @"wwwroot\images")),
        RequestPath = new PathString("/images")
    });
}


3) Run your app and try to load an image from the browser, for example: http://localhost:<port>/images/yourimage.png (any image that exists under wwwroot\images).

4) You will be able to view the image in the browser. Now, let's read the image from a URL in C#.

// Read the image from the remote URL into a stream
using (HttpClient c = new HttpClient())
using (Stream s = await c.GetStreamAsync(imgUrl))
{
    // do any logic with the image stream: save it, store it, etc.
}
If you haven't completed step #2 (this is where I got stuck!), the GetStreamAsync call will fail with a 404 (Not Found) error, because the app hasn't been configured to serve static files.

Hope this helps!

1) Working with static files in ASP.NET Core:

Wednesday, August 03, 2016

Building Big Data Solutions using Hadoop in Azure

Hi All,

Today I am in New York City presenting how to build data solutions in Azure. The presentation focuses on the underlying technologies and tools needed to build big data solutions.

The session also covers the following:

1) What an HDInsight cluster offers in the Hadoop ecosystem technology stack.
2) HDInsight cluster tiers and types.
3) HDInsight developer tools in Visual Studio 2015.
4) Working with HBase databases and Hive View.
5) Building, debugging and deploying Storm apps.
6) Working with Spark clusters.

Session Title: Building Big Data Solutions in Azure.

Session Details:
The session covers how to get started building big data solutions in Azure. Azure provides different HDInsight cluster types for the Hadoop ecosystem. The session covers the basics of HDInsight clusters, including Apache Hadoop HDFS, HBase, Storm, and Spark, and shows how to integrate with HDInsight from .NET using different Hadoop integration frameworks and libraries. It is a jump start for engineers and DBAs with RDBMS experience who want to start working with and developing Hadoop solutions. The session is demo-driven and covers the basics of the Hadoop open source products.

Event Url:  http://www.html5report.com/conference/newyork/agenda.aspx?t=#D1-8

Hope this helps!

Tuesday, August 02, 2016

Working with Hive in HDInsight


While working on building big data solutions on Azure HDInsight clusters, I found some really nice new tools that have been added to HDP to help you work with Hive and HBase datastores.

In this blog post, I would like to share that you can manage your Hive databases and queries using Hive View in HDInsight clusters.

I have provisioned a Linux-based Spark cluster in HDInsight. Spark clusters come with preloaded tools, frameworks and services; the Hive service is preloaded and configured by default as well.

Follow these steps to work with Hive:

1) From Azure Portal, select your HDInsight cluster.
2) Click on Dashboard.
3) Enter your admin username and password.
4) This opens the Ambari home page for your cluster.

5) From the top right corner, click on Hive View.

6) You will be able to write HiveQL statements in the query editor, just as you are used to doing with SQL.

Hive View also includes other capabilities, such as defining UDFs and uploading tables to Hive.

Hope this helps.

Wednesday, July 20, 2016

Easily construct Outlook Group Connector JSON messages in C#

Hi All,

If you are building an Outlook Group Connector, you are probably spending a lot of time writing the JSON message and specifying the different schema elements and attributes needed to build a canvas that looks like the figure below. I have good news for you!

I have your back: I published an Outlook Group Connector SDK ver. 1.1 NuGet package that includes tons of extension methods and features to help you easily build your JSON payload message.

How to send a message in C# to a group:

Message message = new Message()
{
    summary = "This is the subject for the sent message to an outlook group",
    title = msg  // msg, facts and images are defined elsewhere in the sample
};
message.AddFacts("Facts", facts);
message.AddImages("Images", images);
message.AddAction("check details here", "http://mostafaelzoghbi.com");

var result = await message.Send(webhookUrl);
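Under the hood, the SDK assembles the connector card JSON and POSTs it to the webhook URL. For reference, here is a rough language-neutral sketch in Python of the equivalent raw payload; the field names follow the Office 365 connector card format, while the helper names and the sample facts/action are illustrative:

```python
import json
import urllib.request

def build_card(summary, title, facts, action_name, action_url):
    """Assemble a minimal Office 365 connector card payload."""
    return {
        "summary": summary,
        "title": title,
        "sections": [{
            "facts": [{"name": k, "value": v} for k, v in facts.items()],
        }],
        "potentialAction": [{
            "@type": "OpenUri",
            "name": action_name,
            "targets": [{"os": "default", "uri": action_url}],
        }],
    }

card = build_card(
    "This is the subject for the sent message to an outlook group",
    "Hello group",
    {"Status": "Open"},
    "check details here", "http://mostafaelzoghbi.com",
)

def send(webhook_url, payload):
    """POST the card JSON to the group's incoming webhook URL."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

The SDK's extension methods (AddFacts, AddImages, AddAction) save you from hand-writing this structure for every message.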

GitHub Code and Sample links:

1) GitHub Repo for SDK and Samples apps including console & web apps (link).
2) The NuGet package published for use in your apps (link), or search for "Office365ConnectorSDK" in VS 2015.

Hope this helps.

Friday, July 15, 2016

Get started with Outlook Connectors with a sample showcase application

Hi All,
Office 365 Connectors provide a compelling extensibility solution for developers. Developers can build connectors through incoming webhooks to generate rich connector cards. Additionally, with the new "Connect to Office 365" button, developers can embed the button on their site and enable users to connect to Office 365 groups.

A sample showcase for Outlook connector integration
I have built an application that demonstrates Outlook connector integration, including integrating the "Connect to Office 365" button into a third-party website and sending a detailed canvas message to a group.
How to Use it:
  • Outlook Connector landing page: Click on the "Enterprise" menu item and install our connector into one of your Office 365 groups.
  • Send a message to any group: Click on the "Send Message" menu item, set a message title and group name, and click the Send button. Check your group and you will be notified with a fully detailed canvas message.

Useful Resources: 

A general overview of what Office 365 Connectors are and how end-users interact with them.

Complete documentation for building Office 365 Connectors.

A sandbox environment for developer experimentation.

Create and manage Outlook connector settings in this dashboard.