SAS Visual Analytics

February 8, 2019
 

Creating a map with SAS Visual Analytics begins with the geographic variable.  The geographic variable is a special type of data variable where each item has a latitude and longitude value.  For maximum flexibility, VA supports three types of geography variables:

  1. Predefined
  2. Custom coordinates
  3. Custom polygons

This is the first in a series of posts that will discuss each type of geography variable and their creation. The predefined geography variable is the easiest and quickest way to begin and will be the focus of this post.

SAS Visual Analytics comes with nine (9) predefined geographic lookup types.  This lookup method requires that your data contains a variable matching one of these nine data types:

  • Country or Region Names – Full proper name of a country or region (ISO 3166-1)
  • Country or Region ISO 2-Letter Codes – Alpha-2 country code (ISO 3166-1)
  • Country or Region ISO Numeric Codes – Numeric-3 country code (ISO 3166-1)
  • Country or Region SAS Map ID Values – SAS ID values from MAPSGFK continent data sets
  • Subdivision (State, Province) Names – Full proper name for level 2 admin regions (ISO 3166-2)
  • Subdivision (State, Province) SAS Map ID Values – SAS ID values from MAPSGFK continent data sets (Level 1)
  • US State Names – Full proper name for US State
  • US State Abbreviations – Two letter US State abbreviation
  • US Zip Codes – A 5-digit US zip code (no regions)

Once you have identified a variable in your dataset matching one of these types, you are ready to begin.  For our example map, the dataset 'Crime' and variable 'State name' will be used.  Let’s get started.
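To make the requirement concrete, here is a minimal sketch of what such a data set might look like in SAS. The table name, variable names, and values are hypothetical stand-ins for the 'Crime' data used in this example:

/* Toy data set with a variable suited to the predefined "US State Names" lookup */
/* All names and values are hypothetical                                         */
data work.crime;
   length state_name $ 20;
   input state_name $ 1-20 crime_rate;
   datalines;
North Carolina      3456
Virginia            2890
Georgia             3912
;
run;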

Creating a predefined geography variable in SAS Visual Analytics

  1. Begin by opening VA and navigate to the Data panel on the left of the application.
  2. Select the desired dataset and locate a variable that matches one of the predefined lookup types discussed above. Click the down arrow to the right of the variable and select ‘Geography’ from the Classification dropdown menu.
  3. The ‘Edit Geography Item’ window will open. Depending upon the type of geography variable selected, some of the options on this dialog will vary.  The 'Name' textbox is common for all types and will contain the variable selected from your dataset.  Edit this label as needed to make it more user friendly for your intended audience.
  4. The ‘Geography data type’ drop down list is where you select the desired type of geography variable.  In this example, we are using the default predefined option.
  5. Locate the 'Name or code context' dropdown list.  Select the type of predefined variable that matches the data type of the variable chosen from your data.  Once selected, VA scans your data and does an internal lookup on each data item.  This process identifies latitude and longitude values for each item of your dataset.  Lookup results are shown on the right of the window as a percentage and a thumbnail-size map.  The thumbnail map displays the first 100 matches.
  6. If there are any unmatched data items, the first 5 will be displayed.  This may provide a better understanding of your data.  In this example, it is clear from the variable name what type should be selected (US State Names).  However, in most cases the choice will not be this obvious.  The lesson here: know your data!

Unmatched data items indicators

Once you are satisfied with the matched results, click the OK button to continue.  You should see a new section in the Data panel labeled ‘Geography’.  The name of the variable will be displayed beside a globe icon. This icon represents the geography variable and provides confirmation it was created successfully.

Icon change for geography variable

Now that the geography variable has been created, we are ready to create a map.  To do this, simply drag it from the Data panel and drop it on the VA report canvas.  The auto-map feature of VA will recognize the geography variable and create a bubble map with an OpenStreetMap background.  Congratulations!  You have just created your first map in VA.

Bubble map created with predefined geography variable

The concept of a geography variable was introduced in this post as the foundation for creating all maps in VA.  Using the predefined geography variable is the quickest way to get started with Geo maps.  In situations when the predefined type is not possible, using one of VA's custom geography types becomes necessary.  These scenarios will be discussed in future blog posts.

Fundamentals of SAS Visual Analytics geo maps was published on SAS Users.

February 2, 2019
 

SAS Visual Analytics

I don't know about you, but when I read challenges like:

  • Detecting hidden heart failure before it harms an individual
  • Can SAS Viya AI help to digitalize pension management?
  • How to recommend your next adventure based on travel data
  • How to use advanced analytics in building a relevant next best action
  • Can SAS help you find your future home?
  • When does a customer have their travel mood on, and to which destination will he travel?
  • How can SAS Viya, Machine Learning and Face Recognition help find missing people?

…I could continue with the list of ideas provided by the teams participating in the SAS Nordics User Group’s Hackathon. But one thing is for sure: I become enthusiastic, and I'm eager to discover the answers and how analytics can help solve these questions.

When the Nordics team asked for support in providing SAS Viya infrastructure on the Azure Cloud platform, I didn't hesitate to agree and started planning the environment.

Environment needs

Colleagues from the Nordic countries informed us that their Hackathon included fourteen registered teams. Hence, they needed at least fourteen separate environments with the latest and greatest SAS Viya tools such as SAS Visual Analytics, SAS VDMML and SAS Text Analytics. In addition, participants wanted the chance to use open source technologies with SAS and asked us to install RStudio and Jupyter. This would allow data scientists to develop models in the programming language of their choice while still having access to SAS predictive modeling capabilities.

The challenge I faced was how to automate this installation process. We didn't want to repeat an exact installation fourteen times! Also, in case of a failure we needed a way to quickly reinstall a fresh virtual machine in our environment. We wanted to create the virtual machines on the Azure Cloud platform. The goal was to quickly get SAS Viya instances up and running on Azure, with little user interaction. We ended up with a single script expecting one parameter: the name of the instance. Next, I provide an overview of how we accomplished our task.

The setup

Because we need to deploy fourteen identical copies of the same SAS Viya software, we decided to make use of SAS Mirror Manager, a utility for synchronizing SAS software repositories. After downloading the mirror repository, we moved the complete file structure to a web server hosted on a separate Nordics Hackathon repository virtual machine, within the same private network where the SAS Viya instances run. This guarantees low latency when downloading the software.

Once the repository server was up and running, we had what we needed to create a SAS Viya base image. Within that image, we first needed to make sure we met the requirements described in the SAS Viya Deployment Guide. To complete this task, we turned to the Viya Infrastructure Resource Kit (VIRK). The VIRK is a collection of tools, created by Erwan Granger, that assist in infrastructure and readiness-verification tasks. The script is located in a repository on SAS software’s GitHub page. By running the VIRK script before creation of the base image, we guarantee all virtual machines based on the image meet the necessary requirements.

Next, within the base image, we create the SAS Viya playbook as described in the SAS Viya Deployment Guide. That allows us to kick off a SAS Viya installation later, during the initial launch of a new VM based on the image. We cannot install SAS Viya beforehand because one of the requirements is a static IP address and a static hostname, which differ for each VM we launch. However, we can install RStudio Server on the base image. Another important file we make available on the base image is a script that initiates the Ansible installations of OpenLDAP, SAS Viya and Jupyter.

Deployment

After the common components are in place, we follow the instructions from Azure on how to create a custom image of an Azure VM. (This capability is available on other public cloud providers as well.) Now all the prerequisites for creating working Viya environments for the Hackathon are complete. Finally, we create a launch script that installs a full SAS Viya environment from the Azure CLI with a single command and one parameter: the hostname.

$ ./launchscript.sh viya01
$ ./launchscript.sh viya02
$ ./launchscript.sh viya03
...
$ ./launchscript.sh viya12
$ ./launchscript.sh viya13
$ ./launchscript.sh viya14

The script

The main parts of this launch script are:

  1. Testing if the Nordics Hackathon Repository VM is running because we must download software from our own locally created repository.
  2. Launch a new VM, based on the SAS Viya Image we created during preparation, assign a public static IP address, and choose a Standard_E32-16s_v3 Azure VM.
  3. Launch our own Viya-install script to perform the following three sub-steps:
    • Install openLDAP as the identity provider
    • Install SAS Viya just as you would do by following the SAS Viya Deployment Guide.
    • Install Jupyter with a customized Ansible script made by my colleague Alexander Koller.

The result: fourteen full SAS Viya installations ready in about one hour and 45 minutes. We recently posted a LinkedIn video describing the entire process.

Final thoughts

I am planning to write a blog on SAS Communities to share more technical insight on how we created the script. I am honored I was asked to be part of the jury for the Hackathon. I am looking forward to the analytical insights that the different teams will discover and how they will make use of SAS Viya running on the Azure Cloud platform.

Additional resources

Series of Webinars supporting the Nordic Hackathon

Installing SAS Viya Azure virtual machines with a single click was published on SAS Users.

December 22, 2018
 

This post rounds out the year and my series of articles on SAS REST APIs. The first two articles in the series, Using SAS Viya REST APIs to access images from SAS Visual Analytics and Using SAS Cloud Analytics Service REST APIs to run CAS Actions, examined how to use SAS Viya REST and SAS CAS REST APIs to access SAS data from external resources. Follow the links for a quick detour to get some background. This article takes things a step further and outlines how to use a simple application to interact with SAS Viya using REST APIs.

What do chocolate and toffee have to do with optimization? Read on and find out.

The application

When deciding on an example to use in this article, I wanted to focus on the interaction between the application and SAS, not app complexity. I decided to use an application created by my colleague, Deva Kumar. His OptModel1 is an application built on the restAF framework and demonstrates how SAS REST APIs can be used to build applications that exploit various SAS Viya functionalities. This application optimizes the quantities of chocolate and toffee to purchase based on a budget entered by the user.

Think of the application as comparable to the guns and butter economic model. The idea in the model is the more you spend on the military (guns), the less you spend on domestic programs and the civilian goods (butter). As President Johnson stated in 1968, "That bitch of a war, killed the lady I really loved -- the Great Society." In this article, I'll stick to chocolate and toffee, a much less debatable (and tastier) subject matter.

The OptModel1 application uses the runOptmodel CAS action to solve the optimization problem. After the application launches and authenticates the user, it requests a budget. Based on the amount entered, it returns a purchase recommendation for chocolate and toffee. The user may also request a report based on the returned values. OptModel1 and SAS interact through REST API calls. Refer to the diagram below for the application code workflow.

Create the application

To create the application yourself, access the source code and install instructions on SAS's GitHub page. I recommend cloning, or at the least, accessing the repository. I refer to code snippets from multiple files throughout the article.

Application Workflow

The OptModel1 workflow is represented below. Each API call is highlighted in yellow.

OptModel1 Work Flow


The following sections outline each step in the workflow, with corresponding numbers from the diagram.

Launch the application

Enter the URL http://localhost:5006/optmodel in a browser to access the login screen.

OptModel1 app login page

1. Login

Enter proper credentials and click the 'Sign In' button. The OptModel1 application initiates authentication in the logon.html file with this code:

        <script>
            function logonButton() {
                let store = restaf.initStore();
                store.logon(LOGONPAYLOAD)
                    .then(msg => console.log(msg))
                    .catch(err => alert(err));
            }
        </script>

Application landing page

After successfully logging in, the application's main page appears.

Application landing page

Notice how the host and access token are part of the resulting url. For now, this is as far as I'll go on authentication. I will cover this topic in depth in a future article.

As I stated earlier, this is the simplest of applications. I want to keep the focus on what is going on under the covers and not on a flashy application.

2a. Application initialization

Once the app confirms authentication, the application initialization steps ensue. The app needs to be available to multiple users at once, so each session gets its own copy of the template SAS Visual Analytics (VA) report. This avoids users stepping on each other’s changes. This is accomplished through a series of API calls, as explained below. The code for these calls is in vaSetup.js and reportViewer.js.

2b. Copy data

The app copies data from the Public caslib to a temporary worklib – a worklib is a standard caslib like casuser. The CASL code below is submitted to the CAS server for execution. The code that makes the API call to CAS is in vaSetup.js. The relevant snippet of JavaScript code is:

  // create casl statements
    let casl = `
        /* Drop the table in memory */
        action table.dropTable/
        caslib='${appEnv.work.caslib}' name='${appEnv.work.table}' quiet=TRUE;
 
        /* Delete the table from the source */
        action table.deletesource / 
        caslib='${appEnv.work.caslib}' source='${appEnv.work.table}.sashdat' quiet=TRUE;
 
        /* Run data step to copy the template table to worklib */
        action datastep.runCode /
            code='
            data ${appEnv.work.caslib}.${appEnv.work.table}; 
            set ${appEnv.template.caslib}.${appEnv.template.table};
            run;';
 
        /* Save the new work table */
        action table.save /
            caslib  = '${appEnv.work.caslib}'
            name    = '${appEnv.work.table}'
            replace = TRUE
            table= {
                caslib = '${appEnv.work.caslib}'
                name   = '${appEnv.work.table}'
            };
 
        /* Drop the table to force report to reload the new table */
        action table.dropTable/
            caslib='${appEnv.work.caslib}' name='${appEnv.work.table}' quiet=TRUE;
 
 
    `;
 
    // run casl statements on the server via REST API
    let payload = {
        action: 'sccasl.runCasl',
        data: {code: casl}
    }
    await store.runAction(session, payload);
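If you want to experiment with the generated CASL outside the application, a rough equivalent can be submitted interactively with PROC CAS from SAS Studio. This is only a sketch: the caslib and table names are hypothetical stand-ins for the values the app substitutes at run time.

proc cas;
   /* drop the work table if it is already loaded (mirrors the first action above) */
   table.dropTable / caslib="casuser" name="optdata" quiet=TRUE;
 
   /* copy the template table into the work caslib with a DATA step */
   datastep.runCode / code="data casuser.optdata; set public.opt_template; run;";
 
   /* persist the new work table */
   table.save / caslib="casuser" name="optdata" replace=TRUE
                table={caslib="casuser" name="optdata"};
run;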

2c. Does report exist?

This step checks to see if the personal copy of the VA report already exists.

2d. Delete temporary report

If the personal report exists it is deleted so that a new one can be created using the latest VA report template.

// If temporary report exists delete it - allows for potential new template report
    let reportsList = await getReport( store, reports, `${APPENV.work.report}`);
    if ( reportsList !== null ) {
        await store.apiCall(reportsList.itemsCmd(reportsList.itemsList(0), 'delete'));
      };

2e. Create new report

A new personal report is created. This new report is associated with the table that was created in step 2b.

// make the service call to create the temporary report
    let changeData = reportTransforms.links('createDataMappedReport');
    let newReport = await store.apiCall(changeData, p);

2f. Save report info

The information needed to display the new personal report is saved. The report URI is combined with viewer options to build an iframe URL, which is stored in the application environment for later use.

// create src parameter for the iframe
    let options = "&appSwitcherDisabled=true&reportViewOnly=true&printEnabled=true&sharedEnabled=true&informationEnabled=true&commentEnabled=true&reportViewOnly=true";
    let href = `${appEnv.host}/SASReportViewer/?reportUri=${reportUri}${options}`;
 
    // save href in appEnv to use for displaying VA report in an iframe
    appEnv.href = href;

3. Enter budget

Enter a budget in the space provided (I use $10,000 in this example) and click the Optimize button. This action instructs the application to calculate the amount of chocolate and toffee to purchase based on the model.

Enter budget and optimize

4. & 5. Generate and execute CASL code

The code to load the CAS action set, run the CAS action, and store the results in a table, is in the genCode.js file:

  /* Assumption: All necessary input tables are in memory */
	pgm = "${pgm}";
	/*Load action set and run optimization*/
	loadactionset 'optimization';
		action optimization.runOptmodel / 
		code=pgm printlevel=0; 
		run; 
 
	/* save result of optimization for VA to use */
	action table.save /
		caslib  = '${appEnv.work.caslib}'
		name    = '${appEnv.work.table}'
		replace = TRUE
		table= {
			caslib = '${appEnv.work.caslib}'
			name   = '${appEnv.work.table}'
		};
 
	/* fetch results to return for the UI to display */
	action table.fetch r=result /
		table= {caslib = '${appEnv.work.caslib}' name = '${appEnv.work.table}'};
	run;
 
	/* drop the table to force report to reload the new table */
	action table.dropTable/
		caslib='${appEnv.work.caslib}' name='${appEnv.work.table}' quiet=TRUE;

Note: The drop table step at the end of the preceding code is important to force VA to reload the data for the report.

6. Get the results - table form

The results return to the application in table form. We now know to buy quantities of 370 chocolate and 111 toffee with our $10,000 budget. Please refer to the casTableViewer for code details of this step.

Data view in table format

6. Get the results - report form

Select the View Graph button. This action instructs OptModel1 to display the interactive report with the new data (the report we created in step 2f). Please refer to the onReport function in index.html for code details of this step.

Data view in report format

Now that we know how much chocolate and toffee to buy, we can make enough treats for all of the holiday parties just around the corner. More importantly, we see how to integrate SAS REST APIs into our application. This completes the series on using SAS REST APIs. The conversation is not over, however. I will continue to search out and report on other topics related to SAS, open source languages, and agile technologies. Happy Holidays!

SAS REST APIs: a sample application was published on SAS Users.

December 4, 2018
 

When a Visual Analytics 8.3 report moves on a screen from one page to the next – all by itself, without a human hovering over a keyboard – you're seeing the Report Playback feature of SAS Visual Analytics Viewer 8.3 in action.

Reasons for using visual movement

Playable dashboards are easy to create and use. But let's ponder for a moment: Why would you want to set your report in motion? You might want it to scroll automatically:

  • At a kiosk or booth where folks linger for short periods of time.
  • During a presentation to an audience so you're hands-free. You decide how long each page displays and are free to focus on explaining key facts and figures in the moving report without the distraction of manually flipping through each page. Sort of like your car's cruise control –  you take your foot off the pedal and the vehicle keeps going.

Design considerations for playable dashboards

If the intent is to let the report run on its own in a kiosk or a booth, be mindful that such environments require information to move fast. Those watching the playable dashboard expect to grasp key facts and figures quickly. Time is of the essence.

A short attention span benefits from a report design whereby each report page contains one report object that quickly conveys the essence of the message in a few seconds. If you use a complex report design with multiple report objects and a small font, chances aren't good that your audience will absorb meaning from your report.

Any report object (for example, scatter plot) that requires your user to first look at the legend and then comprehend the data in the graph would be unsuitable for playable dashboards that are set to move at a fast rate, such as three or four seconds per page.

Example of a playable dashboard

I designed a report to illustrate carbon dioxide (CO2) emissions for 20 countries. I added five report objects that are easy to comprehend in about five seconds (a subjective estimate, of course). I also added a scatter plot and geomap with legends that are challenging to comprehend, to illustrate why report objects with legends can be unsuitable for a playable dashboard!

For the scatter plot, the presenter would have to expand the legend tooltip to show the legends for the country data in that report object – not realistic in a fast-paced dashboard. In the geomap, the audience needs to look at the legends at the bottom (icons, colors, etc.) and associate that legend with the display in the graph. That’s a lot of brain activity for five seconds – unrealistic. It makes sense, then, to use report objects here that don’t depend on user comprehension of legends to understand the data.

Let the show begin!

When the scatter plot or geomap is displayed, notice how it’s hard to comprehend such report objects in five seconds. In such a short timeframe, it's impossible to process legends and the data, all at once.

How to Create a Playable Dashboard in the Web-based Viewer

  1. In SAS Visual Analytics Viewer, I opened the report and chose Edit playback from the main menu.
  2. In the Edit Playback dialog, I chose the following options:

a. Transition unit – I can choose to display one page at a time or one object at a time. I chose to display one page at a time.

b. Seconds per unit – I chose to display each page for five seconds.

c. Show canvas only – I chose this option because it hides the report control area, page tabs, and page controls for a nicer look.

d. Show timer – This option would display a countdown for each page or object transition. I did not choose this option.

e. Show navigation controls for the report playback – I chose this option because it displays navigation controls in the bottom right corner of the viewer when I hover over the report with my mouse. Personally, I really like this feature because it gives me the flexibility to intervene and move the report pages forward or backward, pause the playback, or exit the playback.

Finally, I save and exit, and the playable dashboard begins to play on my monitor screen.

SAS® Visual Analytics on SAS® Viya® Try it for free!

How to create a playable dashboard with SAS Visual Analytics was published on SAS Users.

November 6, 2018
 

A few weeks ago I posted a cliffhanger-of-a-blog-post. I left my readers in suspense about which of my physical activities are represented in different sets of accelerometer data that I captured. In the absence of more details from me, the internet fan theories have been going wild. Well, it's time for the big reveal! I've created a SAS Visual Analytics report that shows each of these activity streams with the proper label:

Accelerometer measurements per activity -- click to enlarge!

Were your guesses confirmed? Any surprises? Were you more impressed with my safe driving or with my reckless behavior on the trampoline?

Collecting and preparing accelerometer data

You might remember that this entire experiment was inspired by a presentation from Analytics Experience 2018. That's when I learned about an insurance company that built a smartphone app to collect data about driving behavior, and that the app relies heavily on accelerometer readings. I didn't have time or expertise to build my own version of such an app, but I found that there are several good free apps that can collect and export this data. I used an app called AccDataRec on my Android phone.

Each "recording session" generates a TSV file -- a tab-separated file that contains a timestamp and a measurement for each of the accelerometer axes (X, Y, and Z). In my previous post, I shared tips about how to import multiple TSV files in a single step. Here's the final version of the program that I wrote to import these data:

filename tsvs "./accel/*.tsv";
libname out "./accel";
 
data out.accel;
  length 
    casefile $ 100 /* to write to data set */
    counter 8 
    timestamp 8 
    timestamp_sec 8
    x 8 y 8 z 8 
    filename $ 25        
    tsvfile $ 100 /* to hold the value */
  ;
  format timestamp datetime22.3 timestamp_sec datetime20.;
 
  /* store the name of the current infile */
  infile tsvs filename=tsvfile expandtabs;
  casefile=tsvfile;
  input counter timestamp x y z filename;
 
  /* convert epoch time into SAS time */
  timestamp=dhms('01jan1970'd, 0, 0, timestamp / 1000);
 
  /* create a timestamp with the precision of one second */
  timestamp_sec = intnx('second',timestamp,0);
run;

Some notes:

  • I converted the timestamp value from the data file (an epoch time value) to a native SAS datetime value by using this trick.
  • Following advice from readers on my last post, I changed the DLM= option to the simpler EXPANDTABS option on the INFILE statement.
  • Some of the SAS time-series analysis doesn't like the more-precise timestamp values with fractions of seconds. I computed a less precise field, rounding down to the second, just in case.
  • For my reports in this post, I really need only 5 fields: counter (the ordinal sequence of measurements), x, y, z, and the filename (mapping to activity).

The new integrated SAS Viya environment makes it simple to move from one task to another, without needing to understand the SAS product boundaries. I used the Manage Data function (that's SAS Data Management, but does that matter?) to upload the ACCEL data set and make it available for use in my reports. Here's a preview:

Creating a SAS Visual Analytics report

With the data now available and loaded into memory, I jumped to the Explore and Visualize Data activity. This is where I can use my data to create a new SAS Visual Analytics report.

At first, I was tempted to create a Time Series Plot. My data does contain time values, and I want to examine the progression of my measurements over time. However, I found the options of the Time Series Plot to be too constraining for my task, and it turns out that for this task the actual time values really aren't that important. What's important is the sequence of the measurements I've collected, and that's captured as an ordinal in the counter value. So, I selected the Line Plot instead. This allowed for more options in the categorical views -- including a lattice row arrangement that made it easy to see the different activity patterns at a glance. This screen capture shows the Role assignments that I selected for the plot.

Adding a closer view at each activity

With the overview Line Plot complete, it's time to add another view that allows us to see just a single activity and provide a close-up view of its pattern. I added a second page to my report and dropped another Line Plot onto the canvas. I assigned "counter" to the category and the x, y, and z values to the Measures. But instead of adding a Lattice Row value, I added a Button Bar to the top of the canvas. My idea is to use the Button Bar -- which is good for navigating among a small number of values -- as a way to trigger a filter for the accelerometer data.

I assigned "filename" to the Category value in the Button Bar role pane. Then I used the Button Bar options menu (the vertical dots on the right) to add a New filter from selection, selecting "Include only selection".

With this Button Bar control and its filter in place, I can now switch among the data values for the different activities. Here's my "drive home" data -- it looks sort of exciting, but I can promise you that it was a nice, boring ride home through typical Raleigh traffic.

Phone mounted in my car for the drive home

The readings from the "kitchen table" activity surprised me at first. This activity was simply 5 minutes of my phone lying flat on my kitchen table. I expected all readings to hover around zero, but the z axis showed a relatively flat line closer to 10 meters-per-second-per-second. Then I remembered: gravity. This sensor registers Earth's gravity, which we are taught is 9.8 meters-per-second-per-second. The readings from my phone hovered around 9.6 -- maybe my house is in a special low-gravity zone, or the readings are a bit off.

Phone at rest on my kitchen table

Finally, let's take a closer look at my trampoline workout. Since I was holding my phone upright, it looks like the x-axis felt the brunt of the acceleration forces. According to these readings, my phone was subjected to a g-force of 7 or 8 times that of Earth's gravity -- but just for a split second. And since my phone was in my hand and my arm was flailing around (I am not a graceful rebounder), my phone was probably experiencing more force than my body was.

Bounding on the trampoline as high as I can
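Out of curiosity, you can quantify that peak with a short DATA step. The following is a rough sketch that assumes the ACCEL data set created by the import program earlier in this post; it combines the three axes into a resultant magnitude and expresses it in multiples of Earth's gravity (9.8 m/s²):

data work.accel_g;
  set out.accel;
  /* resultant acceleration across all three axes, in meters per second squared */
  magnitude = sqrt(x**2 + y**2 + z**2);
  /* express as a multiple of Earth's gravity */
  gforce = magnitude / 9.8;
run;
 
/* peak g-force per activity (one TSV file per activity) */
proc means data=work.accel_g max maxdec=1;
  class filename;
  var gforce;
run;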

Some love for the Windows 10 app

My favorite method to view SAS Visual Analytics reports is through the SAS Visual Analytics application that's available for Windows 10 and Windows mobile devices. Even on my desktop, where I have a full web browser to help me, I like the look and feel of the specialized Windows 10 app. The report screen captures for this article were rendered in the Windows 10 app. Check out this article for more information about the app. You can try the app for free, even without your own SAS Viya environment. The app is hardwired with a connection to the SAS demo reports at SAS.com.

See also

This is the third (and probably final) article in my series about accelerometer data. See these previous posts for more of the fun background information:

The post Reporting on accelerometer data with SAS Visual Analytics appeared first on The SAS Dummy.

October 31, 2018
 

This article is the first in a series of three posts to address REST APIs and their use in, and with, SAS. Today, I'll present a basic example using SAS Viya REST APIs to download an image from a report in SAS Visual Analytics.

The second article will show an example of the Cloud Analytic Services (CAS) REST APIs. My third planned article will show a simple application that accesses SAS Viya using both sets of REST APIs.

The inspiration for this example: a visualization of air traffic data

I ran across a great post from Mike Drutar: How to create animated line charts that "grow" in SAS Visual Analytics. I followed the steps in Mike's example, which creates a visualization of airline traffic. The result was an animated line chart. For this article, I removed the animation, as it will serve me better in my use case.

SAS Viya APIs and CAS APIs: Two entry points into SAS Viya

The first thing I'd like to cover is why SAS Viya offers two sets of REST APIs. Let's consider who is using the APIs and what they are trying to accomplish. SAS Viya APIs target enterprise application developers (who may or may not be versed in analytics), who intend to build on the work of model builders and data scientists. These developers want to deliver apps based on SAS Viya technology -- for example, to call an analytical model to score data. On the other hand, the CAS REST API is used by data scientists and programmers (who are decidedly adept at analytics) and administrators, who need to interact with CAS directly and are knowledgeable about CAS actions. CAS actions are the building blocks of analytical work in SAS Viya.

How to get started with SAS Viya REST APIs

The best place to start working with SAS Viya REST APIs is on the SAS Developer's web site. There, you will find links to the API documentation.

The REST APIs are written to make it easy to integrate the capabilities of SAS Viya when building applications or creating scripts. The APIs are based on URLs and use HTTP authentication and HTTP verbs. The API documentation page is split into multiple categories. The following list outlines the breakdown:

  • Visualization – Provides access to reports and report images
  • Compute – Acts on SAS compute and analytic servers, including Cloud Analytic Services (CAS)
  • Text Analytics – Provides analysis and categorization of text documents
  • Data Management – Enables data manipulation and data quality operations on data sources
  • Decision Management – Provides access to machine scoring and business rules
  • Core Services – Provides operations for shared resources such as files and folders

The REST API documentation page is divided into multiple sections.

SAS Viya REST API doc

  1. The categories are listed in the upper-left side.
  2. Once you select a category, related services and functions are listed in the lower-left pane.
  3. The service appears in the center pane with a description, parameters, responses, and error codes.
  4. The right pane displays how to form a sample request, any optional or required body text, and sample response code.

The REST API call process

The example outlined in this article shows how to access a report image from SAS Visual Analytics. To try this out yourself, you will need: a SAS Viya environment (with SAS Visual Analytics configured), an access token, and a REST client. The REST client can be cURL (command line), Postman (a popular REST API environment), or Atom with the rest-client plugin -- or any other scripting language of your choice. Even if you do not have access to an environment right now, read on! As a SAS developer, you're going to want to be aware of these capabilities.

Get a list of reports from SAS Visual Analytics

Run the following curl command to get a list of reports on the SAS server:

curl -X GET http://sasserver.demo.sas.com/reports/reports \
  -H 'Authorization: Bearer <access-token-goes-here>' \
  -H 'Accept: application/vnd.sas.table.column+json'
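If you would rather make the same call from a SAS session, here is a rough PROC HTTP sketch. It assumes the same server URL and Accept header as the curl example, with the access token stored in a macro variable named access_token (a placeholder); the JSON libname engine then surfaces the response so you can locate the report id:

filename resp temp;
 
proc http
   url="http://sasserver.demo.sas.com/reports/reports"
   method="GET"
   out=resp;
   headers
      "Authorization"="Bearer &access_token."
      "Accept"="application/vnd.sas.table.column+json";
run;
 
/* browse the parsed response (the collection's items array becomes the ITEMS table) */
/* and look for the id of the desired report                                         */
libname rep json fileref=resp;
 
proc print data=rep.items (obs=10);
run;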

Alternatively, use Postman to enter the command and parameters:

GET Report List API call from Postman

From the JSON response, find the report object and grab the id of the desired report:

GET Report List Response

Create a job

The next step is to create an asynchronous job to generate the SVG image from the report. I use the following HTTP POST request against the /jobs endpoint:

curl -X POST http://sasserver.demo.sas.com/reportImages/jobs \
  -H 'Authorization: Bearer <access-token-goes-here>' \
  -H 'Accept: application/vnd.sas.report.images.job+json' \
  -H 'Content-Type: application/vnd.sas.report.images.job.request+json'

Use the following sample body text:

{
  "reportUri" : "/reports/reports/b555ea27-f204-4d67-9c74-885311220d45",
  "layoutType" : "entireSection",
  "selectionType" : "report",
  "size" : "400x300",
  "version" : 1
}

Here is the sample response:

POST Job Creation Response

The job creation kicks off an asynchronous action. The response indicates whether the job is completed at response time, or whether it's still pending. As you can see from the above response, our job is still in a 'running' state. The next step is to poll the server for job completion.

Poll for job completion

Using the 'id' value from the job creation POST, the command to poll is:

curl -X GET http://sasserver.demo.sas.com/reportImages/jobs/f7a12533-ac40-4acd-acda-e0c902c6c2c1 \
  -H 'Authorization: Bearer <access-token-goes-here>' \
  -H 'Accept: application/vnd.sas.report.images.job+json'

And the response:

GET Poll Job Creation Response

Once the job comes back with a 'completed' state, the response will contain the information we need to fetch the report image.
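Because the job is asynchronous, a script typically polls until the state changes from 'running'. Here is a hedged SAS macro sketch of such a loop; the server URL, job id, and access_token macro variable are placeholders, and the state field name comes from the job responses shown above:

%macro poll_report_job(jobid);
   %let state=running;
   %do %while (&state=running);
      filename jobresp temp;
      proc http
         url="http://sasserver.demo.sas.com/reportImages/jobs/&jobid"
         method="GET"
         out=jobresp;
         headers "Authorization"="Bearer &access_token.";
      run;
 
      /* read the job state from the JSON response (top-level fields land in the ROOT table) */
      libname jr json fileref=jobresp;
      data _null_;
         set jr.root;
         call symputx('state', state);
      run;
      libname jr clear;
 
      /* wait two seconds before asking again */
      %if &state=running %then %do;
         data _null_;
            call sleep(2, 1);
         run;
      %end;
   %end;
   %put NOTE: Job &jobid finished with state &state;
%mend poll_report_job;
 
%poll_report_job(f7a12533-ac40-4acd-acda-e0c902c6c2c1);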

Get the image

I am now ready to get the image. Using the image file name (href field) from the response above, I run the following command:

curl -X GET http://sasserver.demo.sas.com/reportImages/images/K1870020424B498241567.svg \
  -H 'Authorization: Bearer <access-token-goes-here>' \
  -H 'Accept: image/svg+xml'

Postman automatically interprets the response as an image. If you use the curl command, you'll need to redirect the output to a file.

SAS Visual Analytics Graph for Air Traffic

What's Next?

SAS Visual Analytics is usually considered an interactive, point-and-click application. With these REST APIs we can automate parts of SAS Visual Analytics from a web application, a service, or a script. This opens tremendous opportunities for us to extend SAS Visual Analytics report content outside the bounds of the SAS Visual Analytics app.

I'll cover more in my next articles. In the meantime, check out the Visualization APIs documentation to see what's possible. Have questions? Post in the comments and I'll try to address in future posts.

Using SAS Viya REST APIs to access images from SAS Visual Analytics was published on SAS Users.

October 8, 2018
 

When was the last time you or your colleagues wanted access to data and tools to produce reports and dashboards for a business need? Probably within the last hour. Self-service BI applications – gaining popularity as we speak – make gaining insights and decision making faster. But they've also generated a greater need for governance.

Part of governance is understanding the data lifecycle, or data lineage. For example, a co-worker performed some modifications to a dataset and used it to produce a report that you would like to use to help solve a business need.  How can you be sure that the information in this report is accurate? How did the producer of the report calculate certain measures? On what original data set was the report based?

SAS provides many tools to help govern platforms and solutions.  Let’s look at one of those tools to understand the data lifecycle: SAS Lineage Viewer.

Here we have a report created to explore and visualize telecommunications data using SAS Visual Analytics.  The report shows our variable of interest, cross-sell and up-sell flag, and its relationship to other variables. This report will be used to target customers for cross-sell or up-sell.

This report is based on an Analytical Base Table (ABT) that was created by joining two data sets:

  1. Usage information from a subset of customers who have contacted customer care centers.
  2. Cleansed demographics data.

The name of the joined dataset the report is based on is LG_FINAL_ABT.

To make sure we understand the data behind this report we’ll explore it using a lineage viewer (you will need to be assigned to the "Data Management Business User” or “Data Management: Lineage” group, which an administrator can help you with).  From the applications menu, select Explore Lineage.

 

We’ll click on Search for Subjects and search for the report we were just reviewing: Telecommunications.

I’ll enter Telecommunications in the search field then select the Telecommunications report.

The first thing I see is the LG_Final_ABT CAS table my report is dependent on.

If I click on the + sign on the top right corner of the data, LG_Final_ABT, I can see all the other relationships to that CAS table.  There is a Model Studio project, two Visual Analytics Reports (including the report we looked at), and a data view that are all dependent on the LG_FINAL_ABT CAS Table.  This diagram also shows us that the LG_FINAL_ABT CAS table is dependent on the Public CAS library.  We also see that the LG_FINAL_ABT CAS table was loaded into CAS from the LG_FINAL_ABT.sashdat file.

Let’s explore the LG_FINAL_ABT.sashdat file to see its lineage. Clicking on the + expands the view. In the following diagram, I expanded all the remaining items to see the full data lifecycle.

This image shows us the whole data lifecycle.  From LG_FINAL_ABT.sashdat we see that it is dependent on the Create Final LG ABT data preparation plan.  That plan is dependent on two CAS tables: LG_CUSTOMER and LG_ORIG_ABT.  The data lineage viewer shows us that the LG_CUSTOMER table was loaded from a CSV file (lg_customer.csv) and the LG_ORIG_ABT CAS table was loaded from a SAS data set (lg_orig_abt.sas7bdat).

To dive deeper into the mashups and data manipulations that took place to produce LG_FINAL_ABT.sashdat, we can open the data preparation plan.  To do this I’ll right click on Create Final LG ABT and select Actions then Prepare Data.

Here is the data preparation plan.  At the top you can see that the creator of this data set performed five steps – Gender Analysis, Standardize, Remove, Rename and Join.

To get details into each of these steps, click on the titles at the top.  Clicking on Gender Analysis, I see that a gender analysis was performed based on the customer_name field and the results were added to the data set in a variable named customer_name_GND.

Clicking on the Standardize title, I see that there were two standardization tasks performed on the original data set. One for customer state and the other for customer phone number.  I can also see that the results were placed in new fields (customer_state_STND and customer_primary_phone_STND).

Clicking on the Remove title, I see that three variables were dropped from the final dataset.  These variables were the original ones that the user had “fixed” in the previous steps: customer_gender, customer_state, and customer_primary_phone.

Clicking on the Rename title, I see that the new variables have been renamed.

The last step in the process is a join. Clicking on the Join title I see that LG_CUSTOMER was joined with LG_ORIG_ABT based on an inner join on Customer_ID.

We have just walked through the data lineage or data lifecycle for the dataset LG_FINAL_ABT, using SAS tools. I now understand how the data in the report we were looking at was generated. I am confident that the information that I gain from the report will be accurate.

Since sharing information and data among co-workers has become so common, it's now more crucial than ever to think about the data lifecycle. When you gain access to a report that you did not create it is always a good idea to check the underlying data to ensure that you understand the data and that any business insights gained are accurate. Also, if you are sharing data with others and you want to make modifications to it, you should always check the lineage to ensure that you won’t be undermining someone else’s work with changes you make. Thanks to SAS Visual Analytics, all the necessary tools one needs to review data lineage are all available within one interface.

Keep track of where data originated with data lineage in SAS was published on SAS Users.

10月 052018
 

In my earlier blog, I described how to create maps in SAS Visual Analytics 8.2 if you have an ESRI shapefile with granular geographies, such as counties, that you wish to combine into regions. Since posting that blog in January 2018, I have received a lot of questions from users on a range of mapping topics, so I thought a more general post on using – and troubleshooting – custom polygons in SAS Visual Analytics on Viya was in order. Since version 8.3 is now generally available, this post is tailored to the 8.3 version of SAS Visual Analytics, but the custom polygon functionality hasn’t really changed between the 8.2 and 8.3 releases.

What are custom polygons?

Custom polygons are geographic boundaries that enable you to visualize data as shaded areas on the map. They are also sometimes referred to as choropleth maps. For example, suppose you work for a non-profit organization which is trying to decide where to put a new senior center. So you create a map that shows the population of people over 65 years of age by US census tract. The darker polygons suggest a larger number of seniors, and thus a potentially better location to build a senior center:

SAS Visual Analytics 8.3 includes a few predefined polygonal shapes, including countries and states/provinces. But if you need something more granular, you can upload your own polygonal shapes.

How do I create my own polygonal shapes?

To create a polygonal map, you need two components:

  1. A dataset with a measure variable and a region ID variable. For example, you may have population as a measure, and census tract ID as a region ID. A simple frequency can be used as a measure, too.
  2. A “polygon provider” dataset, which contains the same region ID as above, plus geographic coordinates of each vertex in each polygon, a segment ID and a sequence number. (A toy sketch of this structure follows the list.)
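To make that structure concrete, here is a toy sketch of a polygon provider table. The region IDs and coordinates are hypothetical; each region's vertices are listed in the order in which they should be drawn:

/* Toy polygon provider: two small rectangular "regions" (hypothetical IDs and coordinates) */
data work.toy_polygons;
   length region_id $ 8;
   input region_id $ lat long segment seqno;
   datalines;
A  35.70  -79.30  1  1
A  35.70  -79.20  1  2
A  35.80  -79.20  1  3
A  35.80  -79.30  1  4
B  35.60  -79.30  1  5
B  35.60  -79.20  1  6
B  35.70  -79.20  1  7
B  35.70  -79.30  1  8
;
run;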

So where do I get this mysterious polygon provider? Typically, you will need to search for a shapefile that contains the polygons you need, and do a little bit of data preparation. Shapefile is a geographic data format supported by ESRI. When you download a shapefile and look at it on the file system, you will see that it contains several files. For example, my 2010 Census Tract shapefile includes all these components:

Sometimes you may see other components present as well. Make sure to keep all components together.

To prepare this data for SAS Visual Analytics, you have two options.

Preparing shapefile for SAS Visual Analytics: The long way

One method to prepare the polygon provider is to run PROC MAPIMPORT to convert the shapefile into a SAS dataset, add a sequence ID field and then load into the Cloud Analytic Services (CAS) server in SAS Viya. The sequence ID is mandatory, as it helps SAS Visual Analytics to draw the lines connecting vertices in the correct order.

A colleague recently reached out for help with a map of Census block groups for Chatham County in North Carolina. Let’s look at his example:

The shapefile was downloaded from here. We then ran the following code on my desktop:

libname geo 'C:\...\Data';
 
proc mapimport datafile="C:\...\Data\Chatham_County__2010_Census_Block_Groups.shp"
out=work.chatham_cbg;
run;
 
data geo.chatham_cbg;
set  chatham_cbg;
seqno=_n_;
run;

We then manually loaded the geo.chatham_cbg dataset into CAS using self-service import in SAS Visual Analytics. A faster alternative is the %shpimprt macro, which automatically runs PROC MAPIMPORT, creates a sequence ID variable and loads the table into CAS. Here’s an example:

%shpimprt(shapefilepath=/path/Chatham_County__2010_Census_Block_Groups.shp, id=GEOID, outtable=Chatham_CBG, cashost=my_viya_host.com,   casport=5570, caslib='Public');

For this macro to work, the shapefile must be copied to a location that your SAS Viya server can access, and the code needs to be executed in an environment that has SAS Viya installed. So, it wouldn’t work if I tried to run it on my desktop, which only has SAS 9.4 installed. But it works beautifully if I run it in SAS Studio on my SAS Viya machine.

Configuring the polygon provider

The next step is to configure the polygon provider inside your report. I provided a detailed description of this in my earlier blog, so here I’ll just summarize the steps:

  • Add your data to the SAS Visual Analytics report, locate the region ID variable, right-click and select New Geography
  • Give it a name and select Custom Polygonal Shapes as geography type
  • Click on the Custom Polygon Provider box and select Define New Polygon Provider
  • Configure your polygon provider by selecting the library, table and ID column. The values in your ID column must match the values of the region ID variable in the dataset you are visualizing. The ID column, however, does not need to have the same name as in the visualization dataset.
  • If necessary, configure advanced options of the polygon provider (more on that in the troubleshooting section of this blog).

If all goes well, you should see a preview of your polygons and a percentage of regions mapped. Click OK to save your geographic item, and feel free to use it in the Geo Map object.

I followed your instructions, but the map is not working. What am I missing?

I observed a few common troubleshooting issues with custom maps, and all are fairly easy to fix. The symptoms and solutions are summarized below.

Symptom: In the Geographic Item preview, 0% of the regions are mapped.

Solution: Check that the values in the region ID variable match between the main dataset and the polygon provider dataset.

Symptom: I successfully created the map, but the colors of the polygons all look the same. I know I have a range of values, but the map doesn’t convey the differences.

Solution: In your main dataset, you probably have missing region ID values or region IDs that don’t exist in the polygon provider dataset. Add a filter to your Geo Map object to exclude region IDs that can’t be mapped.

Symptom: Only a subset of regions is rendered.

Solution: You may have too many points (vertices) in your polygon provider dataset. SAS Visual Analytics can render up to 250,000 points. If you have a large number of polygons represented in a detailed way, you can easily exceed this limit. You have two options, which you can mix and match:

(1)    Filter the map to show fewer polygons

(2)    Reduce the level of detail in the polygon provider dataset using PROC GREDUCE. See example here. Also, if you imported data using the %shpimprt macro, it has an option to reduce the dataset. Here’s a handy link to In the Geographic Item preview, the note shows that 100% of the regions are mapped, but the regions don’t render, or the regions are rendered in the wrong location (e.g., in the middle of the ocean) and/or at an incorrect scale.
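Here is a minimal sketch of option (2), assuming the Chatham County polygon provider created earlier; the density cutoff is an arbitrary starting point that you would tune for your data:

proc greduce data=geo.chatham_cbg out=work.chatham_reduced;
   id geoid;   /* polygon identifier */
run;

data geo.chatham_cbg_small;
   set work.chatham_reduced;
   where density <= 3;   /* keep only the lower-density (less detailed) points */
   seqno = _n_;          /* re-create the sequence ID after subsetting */
run;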

Symptom: In the Geographic Item preview, the note shows that 100% of the regions are mapped, but the regions don’t render, or the regions are rendered in the wrong location (e.g., in the middle of the ocean) and/or at an incorrect scale.

Solution: This is probably the trickiest issue, and the most likely culprit is an incorrectly specified coordinate space code (EPSG code). The EPSG code corresponds to the type of projection applied to the latitude and longitude in the polygon provider dataset (and the originating shapefile). Projection is a method of displaying points from a sphere (the Earth) on a two-dimensional plane (flat surface). See this tutorial if you want to know more about projections.

There are several projection types and numerous flavors of each type. The default EPSG code used in SAS Visual Analytics is EPSG:4326, which corresponds to the unprojected coordinate system.  If you open advanced properties of your polygon provider, you can see the current EPSG code:

Finding the correct EPSG code can be tricky, as not all shapefiles have consistent and reliable metadata built in. Here are a couple of things you can try:

(1)    Open your shapefile as a layer in a mapping application such as ArcMap (licensed by ESRI) or QGIS (open source) and view the properties of the layer. In many cases the EPSG code will appear in the properties.

(2)    Go to the location of your shapefile and open the .prj file in Notepad. It will show the projection information for your shapefile, although it may look a bit cryptic. Take note of the unit of measure (e.g., feet), datum (e.g., NAD 83) and projection type (e.g., Lambert Conformal Conic). Then, go to https://epsg.io/ and search for your geography.  Going back to the example for Chatham county, I searched for North Carolina. If more than one code is listed, select a few codes that seem to match your .prj information the best, then go back to SAS Visual Analytics and change the polygon provider Coordinate Space property. You may have to try a few codes before you find the one that works best.
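As an alternative to changing the Coordinate Space property, you can reproject the polygon provider itself back to unprojected latitude/longitude before loading it into CAS, so the default EPSG:4326 setting applies. The sketch below is assumption-heavy: it relies on PROC GPROJECT's FROM= and TO= options (available in recent SAS 9.4 releases) and assumes the source data is in EPSG:2264 (NAD83 / North Carolina, US feet), which you would replace with the code you identified for your own shapefile:

proc gproject data=work.chatham_cbg out=work.chatham_cbg_latlon
              from="EPSG:2264" to="EPSG:4326";
   id geoid;   /* polygon identifier */
run;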

Symptom: I ruled out a projection issue, and the note in the Geographic Item preview shows that 100% of the regions are mapped, but the regions still don’t render.

Solution: Take a look at your polygon provider preparation code and double-check that the order of observations didn’t accidentally get changed. The order of records may change, for example, if you use a PROC SQL join when you prepare the dataset. If you change the order of the records before assigning the sequence ID, the points can end up in an illogical order that SAS Visual Analytics will have trouble rendering. Remember, the sequence ID is needed so that SAS Visual Analytics can render the outline of each polygon correctly.

You can validate the order of records by mapping the polygon provider using PROC GMAP, for example:

proc gmap map=geo.chatham_cbg data=geo.chatham_cbg;
   id geoid;                          /* polygon identifier */
   choro geoid / nolegend levels=1;   /* a single level: we only care about the outlines */
run;
quit;

For example, in image #1 below, the records are ordered correctly. In image #2, the order of records is clearly wrong, hence the crisscrossing lines.
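If you do need to join extra attributes onto the polygon provider, one hedged way to keep the rendering order intact is to assign the sequence ID first and sort by it as part of the join. WORK.EXTRA_ATTRS and its BG_NAME column below are illustrative names only:

proc sql;
   create table geo.chatham_cbg_joined as
   select a.*, b.bg_name
   from geo.chatham_cbg as a
        left join work.extra_attrs as b
        on a.geoid = b.geoid
   order by a.seqno;   /* restore the original vertex drawing order */
quit;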

 

As you can see, custom regional maps in SAS Visual Analytics 8.3 are pretty straightforward to implement. The few "gotchas" described here should help you troubleshoot some of the common issues you may encounter.

P.S. I would like to thank Falko Schulz for his help in reviewing this blog.

Troubleshooting custom polygon maps in SAS Visual Analytics 8.3 was published on SAS Users.

October 04, 2018
 

You can now conduct a live demo of SAS Visual Analytics on your mobile device for geographically dispersed participants by using the Present Screen feature, a tucked-away option in the SAS Visual Analytics app for iPad, iPhone, and Android devices. Let’s say I am looking at a report on my mobile device, and I have questions about a couple of items for my colleagues Joe and Anita, who are located in two different cities. The three of us are able to see the report while I demo it, drawing their attention to specific areas of interest in the report.

Sitting in my office, or from any location where I have a Wi-Fi or cellular connection, I can use the Present Screen feature to do a live shared presentation with Joe and Anita. And I don't have to present just one report. During the live presentation, I can close a report, open a different one, perform interactions, and move around in the app between different reports.

And here’s the real beauty of this feature. Neither Joe nor Anita needs to have the SAS Visual Analytics app on a mobile device, or SAS Visual Analytics running on a desktop. The only requirement for participants is a mobile device (iOS, Android, or Windows), a laptop, or a desktop system with internet access via Wi-Fi (or a cellular connection for mobile devices), an email client for receiving the email invitation, and a Web browser. VPN connectivity might be required if the participants' organization requires VPN.

Plus, you can conduct your live presentation for up to 10 people! Before we move on, note that you can also use the iOS feature, AirDrop, on your iOS device to engage participants with the Present Screen feature. This is useful if you’re in a room with a bunch of folks who have iOS devices, and you want to do live sharing.

Ready to try it? Here’s a short checklist of what you need to do a live presentation of SAS Visual Analytics reports.

Requirements for the presenter

  • Use any one of these devices to present your screen for a shared presentation to your participants: iPad, iPhone, Android tablet or smartphone
  • SAS Visual Analytics app installed on your device
  • Connection via Wi-Fi or cellular to the server where the report(s) reside
  • Subscription to the reports that you want to present and share
  • Email client on the iPad or phone
  • VPN connectivity if your organization requires it

Couple of things to note

Say you're presenting to 10 participants via email or AirDrop. Note that as the presenter, you must have the SAS Visual Analytics app open on your device with the Present Screen feature selected and active for your guests to be able to see your reports. Once you exit the app, the screen presentation session ends and the email or AirDrop invitations are no longer valid.

And a note on participant information. After a participant forwards an email invitation to view the presentation to a colleague, the recipient is required to enter a name and email address before joining your live screen presentation. You, as the presenter, get to see the names of the folks who have joined your screen presentation. Participant names and email addresses simply let the presenter know who has joined the presentation. Neither the participants' names nor their email addresses are validated by the SAS Visual Analytics server.

Supported versions for SAS Visual Analytics reports

You can present and share reports with the Present Screen feature for these versions of SAS Visual Analytics:

7.3, 7.4, 8.1, 8.2, 8.3

How to start the screen presentation

In the Analytics report I'm subscribed to from the SAS Visual Analytics app on my iPad, I choose Present Screen.

Choosing Present Screen in the subscribed report

I am reminded that I can present my screen to a maximum of 10 participants. I click OK.

The app prompts me to send email or choose AirDrop to present my screen to participants - I choose email.

My email client opens, and the app includes instructions along with the link that takes them to my live screen presentation. I enter the email addresses for my participants and send the email.

In the email that Anita receives on her desktop PC, she clicks the link, which takes her to her Web browser where my screen presentation is set to start soon.

Anita enters her name and email address, and notes that the presentation has not yet started.

Joe, who has logged on to the presentation from his Android phone, is also presented with the same message on his smartphone.

To begin the screen presentation, I tap on the blinking cursor in the app.

The app reminds me that now my participants can see everything on my iPad screen.

Next, a blue bar at the top of the report indicates that my screen presentation is live and can be seen by Anita and Joe. Now I can begin my report presentation, or exit this report and open a different report to share.

Here is my screen on the iPad Mini where I started my screen presentation:

Anita's screen on her Windows desktop monitor:

Joe's screen on the Android smartphone:

When I am finished with the presentation, I tap on the 'stop' button to end it.

A message displays to indicate that the presentation has ended. Here's an example of that message from Joe's Android smartphone.

Do it live! How to present your screen from the SAS Visual Analytics app was published on SAS Users.