
Enabling a direct export to secure, scalable, and reliable AWS S3 storage will make your Logi Analytics application even more powerful. Content stored in an S3 bucket can be secured with different security options, such as bucket policies and permissions, to make it available to individuals, to groups of people, or to the entire internet. S3 is designed for eleven 9's (99.999999999%) of durability and 99.99% availability, with virtually unlimited storage.

This blog will walk you through nine easy steps to build an AWS S3 export plugin and use it within a Logi Analytics application. You first need to create and build the Export to S3 plugin (a DLL), and then call this plugin within your Logi Analytics application with all the required parameters. Our example shows how to create and use a .NET DLL that checks for an existing file and copies files to AWS S3. (Note that our example uses C#.NET; however, the same plugin can be built in Java.)

Creating the Plugin (DLL)

  1. Add a reference to rdPlugin.dll, which is located in the Logi application folder's \bin directory. The following AWS DLL references are also required:
    • AWSSDK.Core
    • AWSSDK.S3

  2. Include the following namespaces in the class:
    • rdPlugin
    • Amazon
    • Amazon.S3
    • Amazon.S3.Transfer

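    Here is a minimal sketch of the class skeleton with these namespaces. The S3Utility namespace and AMZS3Utility class names are the ones noted later in this post; everything else is boilerplate.

        using rdPlugin;
        using Amazon;
        using Amazon.S3;
        using Amazon.S3.Transfer;

        namespace S3Utility
        {
            public class AMZS3Utility
            {
                // The sendFileToS3 entry point is added in Step 3.
            }
        }
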
  3. Create the sendFileToS3 method, which will be called from the Logi Analytics application. (This method must be public and can have only one ref input parameter, of type rdServerObjects.)

    • Note that any parameters added in the Logi plugin call are encapsulated under rdObjects.PluginParameters and can be accessed as shown below (rdObjects is the ref parameter passed to the method).

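    A minimal sketch of the method and the parameter access follows. The awsAccessKeyID and awsSecretAccessKey names match the parameters described in the "Using the Plugin" section below; the remaining parameter names (awsRegion, s3Bucket, localFilePath) and the string-indexer access to PluginParameters are illustrative assumptions, not exact rdPlugin syntax.

        // Entry point called from the Logi task; must be public, with a
        // single ref parameter of type rdServerObjects.
        public void sendFileToS3(ref rdServerObjects rdObjects)
        {
            // Parameters added to the Logi plugin call are exposed
            // through rdObjects.PluginParameters.
            string accessKeyId     = rdObjects.PluginParameters["awsAccessKeyID"];
            string secretAccessKey = rdObjects.PluginParameters["awsSecretAccessKey"];
            string region          = rdObjects.PluginParameters["awsRegion"];      // illustrative name
            string bucketName      = rdObjects.PluginParameters["s3Bucket"];       // illustrative name
            string localFilePath   = rdObjects.PluginParameters["localFilePath"];  // illustrative name

            // Steps 4-6 below continue inside this method.
        }
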
  4. Initialize the S3 client and transfer utility as shown below. You need to initialize the S3 client with the AccessKeyID, SecretAccessKey, and Region parameters. (These can be null if the proper bucket policy is set or if the appropriate role is set on the EC2 instance.)
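    A sketch of the initialization, reusing the variables read in Step 3. With a suitable bucket policy or an EC2 instance role, the key arguments can be omitted and the SDK's default credential chain will be used instead.

        // Resolve the region from its system name, e.g. "us-east-1".
        RegionEndpoint regionEndpoint = RegionEndpoint.GetBySystemName(region);

        // Initialize the S3 client with the AccessKeyID, SecretAccessKey, and Region.
        AmazonS3Client s3Client = new AmazonS3Client(accessKeyId, secretAccessKey, regionEndpoint);

        // The transfer utility wraps the client for simple file transfers.
        TransferUtility transferUtility = new TransferUtility(s3Client);
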
  5. Optionally, you can check for the existence of the file and provide a warning about overwriting (otherwise, files are overwritten by default).
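    One way to sketch this check: GetObjectMetadata throws an AmazonS3Exception with a NotFound status when the key does not exist. Deriving the S3 key from the local file name is an assumption for illustration.

        string keyName = System.IO.Path.GetFileName(localFilePath);
        bool fileExists = true;
        try
        {
            // Throws if the object is not present in the bucket.
            s3Client.GetObjectMetadata(bucketName, keyName);
        }
        catch (AmazonS3Exception ex)
            when (ex.StatusCode == System.Net.HttpStatusCode.NotFound)
        {
            fileExists = false;
        }

        if (fileExists)
        {
            // Warn the caller: the upload in Step 6 will overwrite this object.
        }
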
  6. Transfer the file to S3 as shown below:
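    The upload itself is a single call; TransferUtility transparently switches to multipart uploads for large files.

        // Copy the local file to s3://<bucketName>/<keyName>.
        transferUtility.Upload(localFilePath, bucketName, keyName);
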
  7. Build the DLL with the namespace and class shown in Step 2 (S3Utility.AMZS3Utility), and then copy the S3Utility, AWSSDK.S3, and AWSSDK.Core DLLs to the Logi application folder's \bin directory.

Using the Plugin in the Logi Application

  1. Note the class type name S3Utility.AMZS3Utility (namespace.classname). You are now all set to use the plugin in the Logi app: calling the plugin method sendFileToS3 from a task just requires referencing the DLL, this class type name, and the method name.
  2. Pass the parameters to the plugin as described below (these are the parameters accessed in the code sketches in Steps 3 and 4 above).
    1. awsAccessKeyID and awsSecretAccessKey are required to authenticate and process the S3 request.
    2. With an adjusted S3 bucket policy, the same code can export to S3 without awsAccessKeyID and awsSecretAccessKey.

 


A descriptive dashboard essentially answers the question, “What happened?” It provides analytics that describe and summarize past events and trends, such as: who are my highest-paying customers; what were the sales over the past month, quarter, or year; and what was the customer churn during a specific time frame (such as during a major organizational shift). The target audience includes executives and upper management who want to look at the bigger picture, as well as managers who would like to identify potential areas of concern and then drill down to diagnose the issue.

So what types of analytics or reports should you build for a descriptive dashboard?

  • Key indicators on the top for a quick review. (After all, the main purpose of a dashboard is to provide instant access to information.)
  • Supporting high-level charts or table summaries that describe the KPIs. When used with numerical KPIs, descriptive data can provide context to help you make quick, effective decisions.
  • Charts/tables to show:
    • Top N categories
    • Trends on a timeline
    • Comparison of top categories

To make your descriptive dashboard as useful as possible, consider these important factors when planning and creating it.

  • The right key performance indicators (KPIs) should be identified and displayed. At a quick glance, they should convey the main essence of the dashboard.
  • Charts and tables should support those key indicators, showing how the key indicator values were derived.
  • Charts and tables should be small and simple, showing the top or bottom 5 or 10 items rather than a list of everything.
  • Key filters should be available to filter the high-level data. These filters should apply to the entire dashboard, not to individual elements.
  • The capability to drill down from the high-level data to view details must be available.

Dashboards are immensely useful for businesses because they can help give you key analytics to understand your business at a glance; they can help you communicate the right information to key stakeholders and users; and they can make your employees more efficient by giving them actionable information at their fingertips. But to get any or all of these benefits, there are seven questions that you should ask before designing a business dashboard. If you ask these questions, you will be on the right track to delivering a useful dashboard that is adopted and used by the right people in your business.

We are excited to announce our sponsorship of a local cricket team, the Stormers. Our very own dbSeer team member, Nanda Kumar, is not only an avid player but also the team captain.


The team formed in the fall of 2012 and plays two seasons per year, in the spring and the fall. The Stormers are part of the Loudoun County Cricket League (LCCL), http://cricclubs.com/vlccl, as well as the Potomac Cricket League (PCL), http://cricclubs.com/PCL, both hard tennis ball cricket leagues near Leesburg, VA.

It’s fun for us to follow the team as they play throughout the year and to help bring more local fans to the sport of cricket, and we wish them luck on a successful season. You can check out the league web pages to find current game schedules. Come on out and watch the Stormers play!

Has BI reached its peak?

I recently read a piece published on March 30, 2016 by the siliconangle.com website questioning whether traditional business intelligence (BI) has reached its peak. It raised a couple of interesting points. Due to the recent “flattening of consumption of business intelligence” and the consolidation of BI players (via mergers and acquisitions, for example), some analysts are proposing that BI has reached its peak. In an interview with siliconangle.com, Ian Andrews, VP of Products at Pivotal, Inc., explained that this implies the industry is just shifting customers among vendors and that we may be “hitting the peak platform for data.” He argues, however, that this is not the case and that “we’re just at the beginning from a platform perspective.” He further states that the focus is now on the consumption of data, on delivering “information in context.”

I would agree with him that BI is shifting toward extracting the complexity out of analytics and pushing “information in context.” This means pushing out information to be consumed and used by any user in the organization, and embedding the relevant analytics into the applications users rely on to complete their daily workflows, so that business processes can be improved right when they need to be. We are shifting toward delivering the right information, right when users need it and right where they need it. Further, analytics are becoming critical to more non-traditional stakeholders throughout the organization, switching from IT-led, system-of-record reporting to business-led analytics.

In my opinion, the recent self-service push in the analytics market has gone too far, and, while it has a role to play in the “information in context” model, it will not be the only path toward the “democratization” of analytics. Embedded BI and custom BI applications will be key components of “information in context.” The democratization of analytics will continue to be the focus in organizations, and BI has certainly not reached its peak. If anything, the opportunities for information in context are far and wide.

Click here for the full interview on siliconangle.com.


In March, I had the privilege of attending the premier big data conference, O’Reilly and Cloudera’s Strata + Hadoop World, in San Jose, CA. I’ll describe some of the more interesting topics and sessions in more detail below. The key technology areas and trends in focus were:

  • Machine learning
  • Streaming and real-time data processing
  • IoT
  • Real-time Analytics

Hadoop Clusters – There were a few sessions and talks about the challenges of managing Hadoop clusters in enterprise environments. I attended two talks on the topic, by GE and British Telecom. GE’s talk focused on how to use a big data platform to shift an enterprise to a data-driven culture, by opening up the data and creating data lakes where data is accessible to the business. The BT talk was about successful design patterns for data hubs. Both talks highlighted the enterprise approach to data lakes or data hubs and the Hadoop challenge of job management within the cluster.

IoT – Intel hosted a session showcasing a data-streaming platform that helped Levi Strauss locate items in its stores. This solution used an RFID (IoT) tag on each item in the store and a machine learning algorithm that learns, over time, where each item should be located. While the application of the technology was a bit simplistic, the platform itself was very impressive.

Machine Learning – Microsoft hosted a talk on machine learning in which they showcased research on machine learning and neuroscience. Remarkably, they have developed an algorithm that is able to identify basic thoughts just by analyzing electrical signals from the brain; in this case, the algorithm was able to identify whether an individual was seeing a face or a building. They showed images to a patient for just a few milliseconds, and the computer could guess, with over 90% accuracy, which picture the patient saw.

Real-time analytics – Something that came up in a few sessions was the challenge of applying real-time analytics to massive data. One use case, discussed by MapR, involved credit card fraud detection. By combining streaming technology and machine learning, MapR impressively demonstrated deciding in under one second whether a transaction was fraudulent.

 

Future of Data Analytics

Another key takeaway from the conference was an overview of the future of data analytics. The following diagram, created by AMPLab at UC Berkeley, is a great representation. In summary, it shows, from the bottom level to the top:

  • Virtualization/distributed file system at the lowest level
  • Compression and encryption at the storage level
  • Spark as the processing engine (notice there is no alternative to Spark!)
  • Access – still too many options (I think this is the issue that needs to be resolved; there is no clear way to access the data)
  • Applications

(Diagram: the future data analytics stack, AMPLab, UC Berkeley)