Tech

April 24, 2019
 
The ODS Excel destination, which became a production feature in SAS 9.4M3 (TS1M3), generates Microsoft Excel workbooks in native XLSX format. By default, the destination creates a new worksheet for each output object (for example, a table or a graphic) that it encounters, so a single workbook can contain multiple worksheets. The destination is also flexible, enabling you to control the vertical arrangement of a worksheet and place output objects where you want them. This blog demonstrates the destination’s flexibility and how you can modify its default behavior by using the ODS EXCEL statement's SHEET_INTERVAL= option.

Adding tables and graphics on the same Microsoft Excel worksheet

By default, the ODS Excel destination adds a new worksheet for each table and graphic. However, at times, you might not want to use this default behavior. If you want more control over this, the SHEET_INTERVAL= ODS Excel option determines when a new worksheet is created. Valid values for the SHEET_INTERVAL= option include:

  • TABLE (the default value) - new sheet for each table in output
  • NONE - keep the output that follows on the current sheet
  • PAGE - new sheet for each page of SAS output
  • PROC - new sheet beginning at the PROC step boundary
  • BYGROUP - new sheet for each BY group of output
  • NOW - begin a new sheet immediately

The value NOW, new for SAS 9.4M5 (TS1M5), triggers the creation of a new worksheet after the destination encounters the next output object.
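Of the values listed above, BYGROUP is the only one that is not demonstrated later in this post. As a quick illustration (a minimal sketch, not taken from the original examples), the following code writes each BY group that PROC PRINT produces to its own worksheet:

ods excel file="c:\temp\byclass.xlsx" options(sheet_interval="bygroup");

proc sort data=sashelp.class out=class;
   by sex;
run;

proc print data=class;
   by sex;
run;

ods excel close;

With this setting, the Sex=F group lands on one worksheet and the Sex=M group on another.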
As an example of opting not to use the default behavior, consider a case where you have a CONTENTS procedure without any options. This procedure generates three separate worksheets with the data-set attributes, the engine host information, and the variable list.

The following table shows the default output that you receive with three individual worksheets:

However, if you want to place all three objects on a single worksheet, you can do that by setting the option SHEET_INTERVAL="NONE". You could also use SHEET_INTERVAL="PROC" in this example, which creates a new worksheet only when a new procedure is encountered.
The following example illustrates how to use this option to include all your output on the same worksheet:

ods excel file="c:\test.xlsx" options(sheet_interval="none");
proc contents data=sashelp.class;
run; 
ods excel close;

Output

Adding text and tables to a new worksheet

Two of the most popular ways to add text to worksheets with the Excel destination are the ODS TEXT= statement and the ODSTEXT procedure. The following example adds text to a worksheet by using the ODS TEXT= statement. You include this statement before each PRINT procedure in this example:

ods excel file="c:\temp\test.xlsx";
ods excel options(sheet_name="Females");
 
ods text="Data for Female Patients";
proc print data=sashelp.class(where=(sex="F"));
run;
 
ods excel options(sheet_name="Males");
ods text="Data for Male Patients";
 
proc print data=sashelp.class(where=(sex="M"));
run;
 
ods excel close;

Output

Notice that the first text string appears at the top of the first worksheet as expected. However, the text from the second ODS TEXT= statement appears at the bottom of this same page rather than at the top of the next worksheet containing the related data table. This behavior illustrates that the ODS TEXT= statement is not very flexible; there is no good way to move that text to the next worksheet with ODS TEXT= alone.

However, you can use the SAS 9.4 ODSTEXT procedure in combination with the SHEET_INTERVAL= option to move the text string to the appropriate worksheet.
The following example uses PROC ODSTEXT and the SHEET_INTERVAL= option to move the text string "Data for Male Patients" to the top of the second worksheet:

ods excel file="c:\temp\test.xlsx";
ods excel options(sheet_name="Females");
 
ods text="Data for Female Patients";
proc print data=sashelp.class(where=(sex="F"));
run;
 
/* Start a new worksheet immediately for the male patients... */
ods excel options(sheet_name="Males" sheet_interval="now");
/* ...then keep the text and table that follow together on that worksheet */
ods excel options(sheet_name="Males" sheet_interval="none");
 
proc odstext;
   p "Data for Male Patients";
run;
 
proc print data=sashelp.class(where=(sex="M"));
run;
 
ods excel close;

Output

Adding multiple tables or graphs on the same worksheet

This final example demonstrates how you can use the SHEET_INTERVAL= option to add multiple tables and graphics to the same Excel worksheet. First, we use the SHEET_INTERVAL="NONE" option in the first ODS EXCEL statement to place the first table and graph on the same worksheet. Then, the SHEET_INTERVAL="NOW" option is included in the second ODS EXCEL statement to create a second worksheet and write the second table and graph to that worksheet:

ods graphics / height=2.5in width=3.5in;
ods excel file="c:\scratch\test.xlsx" options(sheet_interval="none");
 
proc print data=sashelp.class(where=(sex="F"));
run;
 
proc sgplot data=sashelp.class(where=(sex="F"));
scatter x=age y=height;
run;
 
ods excel options(sheet_interval="now");
 
 
proc print data=sashelp.class(where=(sex="M"));
run;
 
proc sgplot data=sashelp.class(where=(sex="M"));
scatter x=age y=height;
run;
 
ods excel close;

Output

Learn more

Control the location of tables, graphs, and text with ODS Excel was published on SAS Users.

April 22, 2019
 

Imagine a world where satisfying human-computer dialogues exist. With the resurgence of interest in natural language processing (NLP) and natural language understanding (NLU), that day may not be far off.

In order to provide more satisfying interactions with machines, researchers are designing smart systems that use artificial intelligence (AI) to develop better understanding of human requests and intent.

Last year, OpenAI used a machine learning technique called reinforcement learning to teach agents to design their own language. The AI agents were given a simple set of words and the ability to communicate with each other. They were then given a set of goals that were best achieved by cooperating (communicating) with other agents. The agents independently developed a simple ‘grounded’ language.

Grounded vs. inferred language


Human language is said to be grounded in experience. People grasp the meaning of many basic words by interaction – not by learning dictionary definitions by rote. They develop understanding in terms of sensory experience -- for example, words like red, heavy, above.

Abstract word meanings are built in relation to more concretely grounded terms. Grounding allows humans to acquire and understand words and sentences in context.

The opposite of a grounded language is an inferred language. Inferred languages derive meaning from the words themselves and not from what they represent. AI systems trained only on textual data, without real-world representations, lack true understanding of what the words mean.

What if the AI agent develops its own language we can’t understand?

It happens. Even if the researcher gives the agents simple English words, the agents inevitably diverge into their own, unintelligible language. Recently, researchers at Facebook, Google and OpenAI all experienced this phenomenon!

Agents are reward driven. If there is no reward for using English (or human language) then the agents will develop a more efficient shorthand for themselves.

That’s cool – why is that a problem?

When researchers at the Facebook Artificial Intelligence Research lab designed chatbots to negotiate with one another using machine learning, they had to tweak one of their models because otherwise the bot-to-bot conversation “led to divergence from human language as the agents developed their own language for negotiating.” They had to use what’s called a fixed supervised model instead.

The problem there is transparency. Machine learning techniques such as deep learning are black-box technologies. A lot of data is fed into the AI, in this case a neural network, to train on and develop its own rules. The model is then fed new data, which it uses to produce answers or information. The black box analogy is used because it is very hard, if not impossible in complex models, to know exactly how the AI derives the output (answers). If AI develops its own languages when talking to other AI, the transparency problem compounds. How can we fully trust an AI when we can’t follow how it is making its decisions and what it is telling other AI?

But it does demonstrate how machines are redefining people’s understanding of so many realms once believed to be exclusively human, like language. The Facebook researchers concluded that it offered a fascinating insight into human and machine language. The bots also proved to be very good negotiators, developing intelligent negotiating strategies.

These new insights, in turn, lead to smarter chatbots that have a greater understanding of the real world and the context of human dialog.

At SAS, we’re developing different ways to incorporate chatbots into business dashboards or analytics platforms. These capabilities have the potential to expand the audience for analytics results and attract new and less technical users.

“Chatbots are a key technology that could allow people to consume analytics without realizing that’s what they’re doing,” says Oliver Schabenberger, SAS Executive Vice President, Chief Operating Officer and Chief Technology Officer in a recent SAS Insights article. “Chatbots create a humanlike interaction that makes results accessible to all.” The evolution of NLP toward NLU has a lot of important implications for businesses and consumers alike.

Satisfying human-computer dialogues will soon exist, and they will have applications in medicine, law, and the classroom, to name but a few. As the volume of unstructured information continues to grow exponentially, we will benefit from AI’s tireless ability to help us make sense of it all.

Further Resources:
Natural Language Processing: What it is and why it matters
White paper: Text Analytics for Executives: What Can Text Analytics Do for Your Organization?
SAS® Text Analytics for Business Applications: Concept Rules for Information Extraction Models, by Teresa Jade, Biljana Belamaric Wilsey, and Michael Wallis
Unstructured Data Analysis: Entity Resolution and Regular Expressions in SAS®, by Matthew Windham
SAS: What are chatbots?
Blog: Let’s chat about chatbots, by Wayne Thompson

Moving from natural language processing to natural language understanding was published on SAS Users.

April 16, 2019
 

This blog post is based on the Code Snippets tutorial video in the free SAS® Viya® Enablement course from SAS Education. Keep reading to learn more about code snippets or check out the video to follow along with the tutorial in real-time.

Has there ever been a block of code that you use so infrequently that you always seem to forget the options that you need? Conversely, has there ever been a block of code that you use so frequently that you grow tired of typing it all the time? Code snippets can greatly assist with both of these scenarios. In this blog post, we discuss using pre-installed code snippets and creating new code snippets within SAS Viya.

Pre-installed code snippets

Figure 1: Pre-installed Snippets

SAS Viya comes with several code snippets pre-installed, including snippets to connect to CAS. To access these snippets, expand the Snippets area on the left navigation panel of SAS Studio as shown in Figure 1. You can see that the snippets are divided into categories, making it easier to find them.

If you double-click a pre-installed code snippet, or if you click and drag the snippet into the code editor panel, then the snippet will appear in the panel.

Snippets can range from very simple to very complex. Some contain comments. Some contain macro variables. Some might be only a couple of lines of code. That is the advantage of snippets. They can be anything that you want them to be.

Create new snippets

Now, let’s create a snippet of our own. Figure 2 shows an example of code that calls PROC CARDINALITY. This code is complete and fully executable. When you have the code the way that you want in your code window, click on the shortcut button for Add to My Snippets above the code. The button is outlined in a box in Figure 2.
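The figure itself is not reproduced here, but a snippet along these lines would be a typical candidate. This is only a sketch: the CAS session name, libref, and table names are illustrative assumptions, not the exact code shown in Figure 2:

cas mysess;                                   /* start a CAS session                      */
libname mycas cas;                            /* libref that points to in-memory tables   */

proc casutil;
   load data=sashelp.cars casout="cars" outcaslib="casuser" replace;   /* example table   */
run;

proc cardinality data=mycas.cars outcard=mycas.cars_card;   /* column-level cardinality summary */
run;

proc print data=mycas.cars_card;
run;

cas mysess terminate;

Saving a block like this as a snippet keeps the CAS setup and the PROC CARDINALITY call one double-click away.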

Figure 2: Add to My Snippets Button

A window will appear that asks you to name the snippet. Naming the snippet then saves it into the My Snippets area in the left navigation panel for future use.

Remember that snippets are extremely flexible. The code that you save does not have to be fully executable. Instead of supplying the data source in your code, you may instead include notes or comments about what needs to be added, which makes the code more general, but it is still a very useful snippet.

To use one of your saved snippets, simply navigate to the My Snippets area, then double-click on your snippet or drag it into the code window.

Want to learn more about SAS Viya? Download the free e-book Exploring SAS® Viya®: Programming and Data Management. The content in this e-book is based on "SAS® Viya® Enablement," a free course available from SAS Education.

Using code snippets in SAS® Viya® was published on SAS Users.

April 12, 2019
 

At the end of my SAS Users blog post explaining how to install SAS Viya on the Azure Cloud for a SAS Hackathon in the Nordics, I promised to provide some technical background. I ended up with only one manual step by launching a shell script from a Linux machine and from there the whole process kicked off. In this post, I explain how we managed to automate this process as much as possible. Read on to discover the details of the script.

Pre-requisite

The script uses the Azure command-line interface (CLI) heavily. The CLI is Microsoft's cross-platform command-line experience for managing Azure resources. Make sure the CLI is installed; otherwise, you cannot use the script.

The deployment process

The process contains three different steps:

  1. Test the availability of the SAS Viya installation repository.
  2. Launch a new Azure Virtual Machine. This action uses a previously created custom Azure image.
  3. Perform the actual installation.

Let’s examine the details of each step.

Test the availability of the SAS Viya installation repository

When deploying software in the cloud, Red Hat Enterprise Linux recommends using a mirror repository. Since the SAS Viya package allows for this installation method, we decided to use the mirror for the hackathon images. This is optional but optimal, for instance when your deployment does not have access to the Internet or when you must always deploy the same version of software (such as for regulatory reasons or for testing/production purposes).

In our Azure Subscription we created an Azure Resource group with the name ‘Nordics Hackathon.’ Within that resource group, there is an Azure VM running a web server hosting the downloaded SAS Viya repository.

Azure VM running HTTPD Server and hosting a SAS Viya Mirror Repository

Of course, we cannot start the SAS Viya installation before being sure this VM – hosting all rpms to install SAS Viya – is running.
To validate that the VM is running, we issue the start command from the CLI:

az vm start -g [Azure Resource Group] -n [AZ VM name]

Something like:

az vm start -g my_resourcegroup -n my_viyarepo34

If the server is already running, nothing happens. If not, the command starts the VM. We can also check the Azure console:

Azure Console with 'Running' VMs

Launching the VM

The second part of the script launches a new Azure VM. We use the custom Azure image we created earlier. The SAS Viya image creation is explained in the first blog post.

The Azure image used for the Nordics hackathon was the template for all other SAS Viya installations. On this Azure image we completed several valuable tasks:

  • We performed a SAS Viya pre-deployment assessment using the SAS Viya Administration Resource Kit (Viya ARK) utility tool. The Viya ARK - Pre-installation Playbook is a great tool that checks all prerequisites and performs many pre-deployment tasks before deploying SAS Viya software.
  • Installed R-Server and R-Studio
  • Installed Ansible
  • Created a SAS Viya Playbook using the SAS Orchestration CLI.
  • Customized Ansible playbooks created by SAS colleagues used to kickoff OpenLdap & JupyterHub installation.

Every time we launch our script, a new Azure virtual machine launches as an exact copy of that image, fully customized to our needs for the hackathon.
Below is the Azure CLI command used in the script which creates a new Azure VM.

az vm create --resource-group [Azure Resource Group] --name $NAME --image viya_Base \
--admin-username azureuser --admin-password [your_pw] --subnet [subnet_id] \
--nsg [optional existing network security group] --public-ip-address-allocation static \
--size [any Azure size] --tags name=$NAME

After the creation of the VM, we install SAS Viya in the third step of the process.

Installation

After running the script three times (using a different value for $NAME), we end up with the following high-level infrastructure:

SAS Viya on Azure Cloud deployment

After the Azure VM launches, we run the viya-install.sh install script, which is included in the /opt/sas/install/ location of the original image.
In the last step of the deployment process, the script installs OpenLdap, SAS Viya and JupyterHub. The following command runs the script:

az vm run-command invoke -g [Azure Resource Group] -n $NAME --command-id RunShellScript --scripts "sudo /opt/sas/install/viya-install.sh &"

The steps in the script should be familiar to those with experience installing SAS Viya and/or Ansible playbooks. Below is the script in its entirety.

#!/bin/bash
touch /start
####################################################################
echo "Starting with the installation of OPENLDAP. Check the openldap.log in the playbook directory for more information" > /var/log/myScriptLog.txt
####################################################################
# install openldap
cd /opt/sas/install/OpenLDAP
ansible-playbook openldapsetup.yml
if [ $? -ne 0 ]; then { echo "Failed the openldap setup, aborting." ; exit 1; } fi
cp ./sitedefault.yml /opt/sas/install/sas_viya_playbook/roles/consul/files/sitedefault.yml
if [ $? -ne 0 ]; then { echo "Failed to copy file, aborting." ; exit 1; } fi
####################################################################
echo "Starting Viya installation" >> /var/log/myScriptLog.txt
####################################################################
# install viya
cd /opt/sas/install/sas_viya_playbook
ansible-playbook site.yml
if [ $? -ne 0 ]; then { echo "Failed to install sas viya, aborting." ; exit 1; } fi
####################################################################
echo "Starting jupyterhub installation" >> /var/log/myScriptLog.txt
####################################################################
# install jupyterhub
cd /opt/sas/install/jupy-azure
ansible-playbook deploy_jupyter.yml
if [ $? -ne 0 ]; then { echo "Failed to install jupyterhub, aborting." ; exit 1; } fi
####################################################################
touch /finish 
####################################################################

Up next

In a future blog, I hope to show you how to get up and running with the SAS Viya Azure Quick Start. For now, the details I provided in this and the previous blog post are enough to get you started deploying your own SAS Viya environments in the cloud.

Script for a SAS Viya installation on Azure in just one click was published on SAS Users.

April 9, 2019
 

Natural language understanding (NLU) is a subfield of natural language processing (NLP) that enables machine reading comprehension. While both understand human language, NLU goes beyond the structural understanding of language to interpret intent, resolve context and word ambiguity, and even generate human language on its own. NLU is designed for communicating with non-programmers – to understand their intent and act on it. NLU algorithms tackle the extremely complex problem of semantic interpretation – that is, understanding the intended meaning of spoken or written language, with all the subtleties of human error, such as mispronunciations or fragmented sentences.

How does it work?

After your data has been analyzed by NLP to identify parts of speech and so on, NLU utilizes context to discern the meaning of fragmented and run-on sentences and to execute the intent. For example, imagine a voice command to Siri or Alexa:

Siri / Alexa play me a …. um song by ... um …. oh I don’t know …. that band I like …. the one you played yesterday …. The Beach Boys … no the bass player … Dick something …

What are the chances of Siri / Alexa playing a song by Dick Dale? That’s where NLU comes in.

NLU reduces the human speech (or text) into a structured ontology – a data model comprising a formal, explicit definition of the semantics (meaning) and pragmatics (purpose or goal). The algorithms pull out such things as intent, timing, location and sentiment.

The above example might break down into:

Play song [intent] / yesterday [timing] / Beach Boys [artist] / bass player [artist] / Dick [artist]

By piecing together this information you might just get the song you want!

NLU has many important implications for businesses and consumers alike. Here are some common applications:

  • Conversational interfaces – BOTs that can enhance the customer experience and deliver efficiency.
  • Virtual assistants – natural language powered, allowing for easy engagement using natural dialogue.
  • Call steering – allowing customers to explain, in their own words, why they are calling rather than going through predefined menus.
  • Smart listener – allowing users to optimize speech output applications.
  • Information summarization – algorithms that can ‘read’ long documents and summarize the meaning and/or sentiment.
  • Pre-processing for machine learning (ML) – the information extracted can then be fed into a machine learning recommendation engine or predictive model. For example, NLU and ML are used to sift through novels to predict which would make hit movies at the box office!

Imagine the power of an algorithm that can understand the meaning and nuance of human language in many contexts, from medicine to law to the classroom. As the volumes of unstructured information continue to grow exponentially, we will benefit from computers’ tireless ability to help us make sense of it all.

Further Resources:
Natural Language Processing: What it is and why it matters

White paper: Text Analytics for Executives: What Can Text Analytics Do for Your Organization?

SAS® Text Analytics for Business Applications: Concept Rules for Information Extraction Models, by Teresa Jade, Biljana Belamaric Wilsey, and Michael Wallis

Unstructured Data Analysis: Entity Resolution and Regular Expressions in SAS®, by Matthew Windham

So, you’ve figured out NLP but what’s NLU? was published on SAS Users.

April 8, 2019
 


As word spreads that SAS integrates with open source technologies, people are beginning to explore how to connect to, interact with, and use SAS in new ways. More and more users are examining the possibilities, and with this come questions like: How do I code A, integrate B, and accomplish C?

Documentation is plentiful but is undergoing a makeover. People aren’t sure where to go for help – and that's why we're launching the SAS Developers Community, where you can gather to ask questions and get answers.

The community will mirror the activities in existing SAS Communities: Q&A, library articles, tips, technical discussions, etc. We migrated some content from other boards. For example, we moved the content from the Coding on SAS Viya board to the new community. Additionally, we scoured other boards for content that may be better aligned with developers and moved it. We also created some original content. Any good community needs participation by all, so read on and get the 411 on the new Developers Community.

Who is the target audience?

Developers – data scientists, application developers, analysts, programmers and administrators – who need to access SAS resources and/or run SAS procedures. This audience may or may not have SAS programming skills but needs to access and analyze data using SAS.

What can developers expect to find?

The Developers Community provides a forum for collaboration, Q&A, and knowledge and resource sharing. The focus will be on developers using open source languages and technology. The community will create synergy between communities.sas.com, developer.sas.com, and github.com/sassoftware. SAS employees and external users will post how-to articles and other items of interest in the library section of the community. This community will not replace the SAS Programming Communities; rather, it will fill a void for non-SAS programmers who have a need or desire to interact with SAS.

When will the community launch?

The Developers Community is live! The site is public, and we've moved existing artifacts to the community. I am attending SAS Global Forum and will be available to answer questions about the new community from our booth in the Quad. Come by and see me!

Where will the community live?

The Developers Community exists on communities.sas.com, under the Developers Category.

Why do we need a community for developers?

Developers need a centralized place to share ideas, ask and answer questions, and discover resources. Currently developers lack a forum to work through things such as authentication, coding, API use, and integration issues. The community will encourage communication, engagement and leadership. Also, the Developers Community will be tightly integrated with the SAS Developers web site and SAS GitHub resources.

How do we go about creating the community?

After seeding the SAS Developer Community with existing discussions, we'll build out a group of SAS developer experts to help monitor the community. The true magic will happen as questions are asked, discussions transpire, and ideas are shared. But we need your help too. Here is your call to action.

Share the community with your networks, buddies and even family members who may get something out of chatting it up about how to develop in SAS. The livelihood of the community hinges on user interaction. Our current and future users will thank you for it. And you may make a friend while you're at it.

Launching the Developers Community in SAS Communities was published on SAS Users.

April 8, 2019
 
The catch phrase “everything happens somewhere” is increasingly common these days.  That “somewhere” translates into a location on the Earth: a latitude and longitude.  When one of these “somewheres” is combined with many others, you quickly have a robust spatial data set that becomes actionable with the right analytic tools.

Opportunities for Spatial Analytics are increasing

In today’s modern world, GPS-enabled devices are ubiquitous, and their use continues to increase daily.  Cell phones, cars, fitness trackers, and cameras are all able to locate and track our position.  As a result, the location analytics market is expected to grow to over USD 16 Billion by 2021, up 17.6% from 2016 [1].

Waldo Tobler, an American-Swiss geographer and cartographer, developed his First Law of Geography based on this concept of everything happening somewhere.  He stated, “Everything is related to everything else, but near things are more related than distant things”[2].  As analytic professionals, we are accustomed to working with these correlations using scatterplots, heatmaps, or clustering models.  But what happens when we add a geographic map into the analysis?

Maps offer the ability to unlock a new level of insight into our data that traditional graphs do not offer: personal connection.  As humans, we naturally relate to our surroundings on a spatial level.   It helps build our perspective and frame of reference through which we view and navigate the world.  We feel a sense of loss when a physical landmark from our childhood – a building, tree, park, or route we used to walk to school – is destroyed or changed from the memories we have of it.  In this sense, we are connected, spatially and emotionally, to our surroundings.

We inherently understand how data relates to the world around us, at some level, just by viewing it on a map.  Whether it is a body of water or a mountain affecting a driving route or maybe a trendy area of a city causing housing prices to increase faster than the local average, a map connects us with these facts intuitively.  We come to these basic conclusions based solely on our experiences in the world and knowledge of the physical landmarks in the map.

One of the best examples of this is the 1854 Cholera outbreak in London.  Dr. John Snow was one of the first to use a map for understanding the origin of an epidemiological outbreak.  He created a map of the affected London neighborhood by plotting the location of all known Cholera deaths.  In addition to the deaths, he also plotted the location of 13 community wells that served as the public water supply.  Using this data, he was able to see a clustering of deaths around a single pump.  Armed with this information, Dr. Snow was able to convince local officials to remove the handle from the Broad Street pump.  Once removed, new cases of Cholera quickly began to diminish.  This helped prove his theory that the outbreak was not air-borne, as commonly believed at that time, but rather water-borne. [3]

1854 London Cholera deaths: Tabular data vs. Coordinate map [3]

Let’s look at how Dr. Snow’s map helped mitigate the outbreak and prove his theory.  The image above compares the data of the recorded deaths and community wells in tabular form to a Coordinate map.  It is obvious from the coordinate map that there is a clustering of points.  Town officials and those familiar with the neighborhood could easily get a sense of where the outbreak was concentrated.  The map told a better story by connecting their personal experience of the area to the locations of the deaths and ultimately to the wells.  Something a data table or traditional graph could not do.

Maps of London Cholera deaths with modern analytic overlays [3]

Today, with the computing power and modern analytic methods available to us, we can take the analysis even further.  The examples above show the same coordinate map with added Voronoi polygon and cluster analysis overlays.  The concentration around the Broad Street pump becomes even clearer, showing why Geographic Maps are an important tool to have in your analytic toolbox.
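For readers who want to try this kind of coordinate map in SAS, the SGMAP procedure can plot point data over a background map. The sketch below is only an illustration: the cholera_deaths data set and its longitude/latitude variable names are hypothetical, but any point data set with coordinate variables follows the same pattern.

title "1854 London cholera deaths";
proc sgmap plotdata=cholera_deaths;                     /* hypothetical data set of death locations */
   openstreetmap;                                       /* street-map background tiles              */
   scatter x=longitude y=latitude /                     /* one marker per recorded death            */
           markerattrs=(color=red symbol=circlefilled);
run;
title;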

SAS Global Forum 2019 is being held April 28-May 1, 2019 in Dallas, Texas.  If you are planning to go to this year’s event, be sure to attend one of our presentations on the latest mapping features included in SAS Visual Analytics and BASE SAS.  While you’re there, don’t forget to stop by the SAS Mapping booth located in the QUAD to say ‘Hi!’ and let us help with your spatial data needs.  See you in Dallas!

Introduction to Esri Integration in SAS Visual Analytics

  • Monday, April 29, 4:30-5:30p, Room: Level 1, D162

There’s a Map for That! What’s New and Coming Soon in SAS Mapping Technologies

  • Tuesday April 30, 4:00-4:30p, Room: Level 1, D162

Creating Great Maps in ODS Graphics Using the SGMAP Procedure

  • Wednesday May 01, 11:30a-12:30p, Room: Level 1, D162

[1] https://www.marketsandmarkets.com/Market-Reports/location-analytics-market-177193456.html

[2] https://en.wikipedia.org/wiki/Tobler%27s_first_law_of_geography

[3] https://www1.udel.edu/johnmack/frec682/cholera/

How the 1854 Cholera outbreak showed us the importance of spatial analysis was published on SAS Users.

April 5, 2019
 

Recently, you may have heard about the release of the new SAS Analytics Cloud. The platform allows fast access to data-science applications in the cloud! Running on the SAS Cloud and using the latest container technology, Analytics Cloud eliminates the need to install, update, or maintain software or related infrastructure.

SAS Machine Learning on SAS Analytics Cloud is designed for SAS and open source data scientists to gain on-demand programmatic access to SAS Viya. All the algorithms provided by SAS Visual Data Mining and Machine Learning (VDMML), SAS Visual Statistics and SAS Visual Analytics are available through the offering. Developers and data scientists access SAS through a programming interface using either the SAS or Python programming languages.

A free trial for Analytics Cloud is available, and registration is simple. The trial environment allows users to manage and collaborate with others, share data, and create runtime models to analyze their data. The system is pre-loaded with sample data for learning, and allows users to upload their own data. My colleague Joe Furbee explains how to register for the trial and takes you on a tour of the system in his article, Zero to SAS in 60 Seconds- SAS Machine Learning on SAS Analytics Cloud.

Luckily, I had the privilege of being the technical writer for the documentation for SAS Analytics Cloud, and through this met two of my now close friends at SAS.

Alyssa Andrews (pictured left) and Mariah Bragg (pictured right) are both Software Developers at SAS and worked together on the UI for SAS Analytics Cloud. Mariah works in the Research and Development (R&D) division of SAS while Alyssa works in the Information Technology (IT) division. As you can see, this project ended up being an interesting mix of SAS teams!

As Mariah told me the history, I learned that SAS Analytics Cloud “was a collaborative project between IT and R&D. The IT team presented the container technology idea to Dr. Goodnight but went to R&D because they wanted this idea run like an R&D project.”

As we prepared for the release of SAS Analytics Cloud to the public, I asked Mariah and Alyssa about their experience working on the UI for SAS Analytics Cloud, and about all the work that they had completed to bring this powerful platform to life!


What is SAS Analytics Cloud for you? How do you believe it will help SAS users?

Alyssa: For me, it is SAS getting to do Software as a Service. So now you can click on our SAS software and it can magically run without the complexity of shipping a technical support agent to the customer's site to install a bunch of complex software.

Mariah: I agree. This will be a great opportunity for SAS to unify and have all our SAS products on cloud.

Alyssa: Now, you can trial and then pay for SAS products on the fly without having to go through any complexities.

What did you do on the project as UI Developers?

Alyssa: I was lent out to the SAS Analytics Cloud team from another team and given a tour-of-duty because I had a background in Django (a high-level Python Web design tool) which is another type of API framework you can build a UI on top of. Then I met Mariah, who came from an Angular background, and we decided to build the project on Angular. So, I would say Mariah was the lead developer and I was learning from her. She did more of the connecting to the API backend and building the store part out, and I did more of the tweaks and the overlays.

What is something you are proud of creating for SAS Analytics Cloud?

Mariah: I’m really proud to be a part of something that uses Angular. I think I was one of the first people to start using Angular at SAS and I am so excited that we have something out there that is using this new technology. I am also really proud of how our team works together, and I’m really proud of how we architectured the application. We went through multiple redesigns, but they were very manageable, and we really built and designed such that we could pull out components and modify parts without much stress.

Alyssa: That we implemented good design practices. It is a lot more work on the front-end, but it helps so much not to have snowflake code (a term developers use for code that isn’t reusable, or is so unique that it becomes a problem later on and adds weight to the program) floating around. Each piece of code is there for a reason; it’s very modular.

What are your hopes for the future of SAS Analytics Cloud?

Alyssa: I hope that it continues to grow and that we add even more applications to this new container technology, so that SAS can move even more into the cloud arena. I hope it brings success. It is a really cool platform, so I can’t wait to hear about users and their success with it.

Mariah: I agree with Alyssa. I also hope it is successful so that we keep moving into the Cloud with SAS.

Learning more

As a Developmental Editor with SAS Press, I found it a new and engaging experience to work with such an innovative technology as SAS Analytics Cloud. I was happy I got to work with such an exciting team, and I look forward to what is next for SAS Analytics Cloud.

And as a SAS Press team member, I hope you check out the new way to trial SAS Machine Learning with SAS Analytics Cloud. And while you are learning SAS, check out some of our great books that can help you get started with SAS Studio, like Ron Cody’s Biostatistics by Example Using SAS® Studio and also explore Geoff Der and Brian Everitt’s Essential Statistics Using SAS® University Edition.

Already experienced but want to know more about how to integrate R and Python into SAS? Check out Kevin D. Smith’s blogs on R and Python with SAS Viya. Also take a moment to investigate our new books on using open source R and Python with SAS Viya: SAS Viya: The R Perspective by Yue Qi, Kevin D. Smith, and XingXing Meng and SAS Viya: The Python Perspective by Kevin D. Smith and XingXing Meng.

These great books can set you on the right path to learning SAS before you begin your jump into SAS Analytics Cloud, the new way to experience SAS.

SAS® Analytics Cloud—an interview with the women involved was published on SAS Users.

April 5, 2019
 

You are a data scientist, in your office, doing data scientist-y things when your manager's manager's manager makes an impossible request. She wants you to take a raw data set from the stem cell research team, scrub the data, create and score models, and be ready to rescore when new data becomes available. And she wants it in a week. WHAT?! Your company doesn't own an analytics software license, and a spreadsheet is not going to work on this data with millions of records. Even if you received funding, how could you ever create and maintain an environment under your tight deadline? Take a deep breath, conjure your inner data scientist acumen, and realize SAS has the answer.

SAS Machine Learning on SAS Analytics Cloud provides on-demand programming access to machine learning algorithms in the cloud. No downloads, no install, no infrastructure, no maintenance. This solution provides a multithreaded, multiuser environment for concurrent access to data in memory. The solution is designed for data scientists (and others) coding in SAS or Python and allows them on-demand programmatic access to SAS Viya. You can find more details on Analytics Cloud in the fact sheet. You can even try it for free! The rest of this article walks you through the features of this new SAS offering and outlines how it can help you complete the task bestowed upon you.

Register and get started

Literally, to sign up for the trial, all you need are a SAS Profile, an email address, and a PC. You will be coding in SAS in less than a minute. From the SAS Cloud Analytics page, select the Get Free Trial button. This takes you to the SAS Profile login page (note you can create your SAS Profile here if you do not have one).

SAS Profile log in or creation

Agree to the Terms and Conditions on the License Agreement page and select the Continue button:

Trial License Agreement

You will receive an email containing a URL much like the following:

email confirmation with trial URL

Logging in

Select the link or paste it into your browser (Google Chrome 64-bit recommended) and you will see the log in screen. Enter your SAS Profile credentials and click the Sign In button.

Sign In screen

The Home screen (Applications) appears.

Home Page

We'll discuss the Data and Team pages in further detail later in this article. You have two options for applications: SAS Studio (for SAS programming) and JupyterLab (for Python programming). This article focuses on SAS Studio. A follow-up article will cover the JupyterLab use case. Select the SAS Studio button; a new tab opens to SAS Studio, and we're ready to start coding.

SAS Studio

You are familiar with the SAS language, but you need to brush up a little. Have no fear, support documentation is easily accessible. Also, the SAS Data Mining and Machine Learning Community is a great place to discover additional resources and ask questions. Finally, embedded in SAS Studio are code snippets. You decide to explore the latter.

Code snippets

In SAS Studio select the Snippets twisty in the left pane. Navigate to the SAS Viya Machine Learning section. Here you find code samples you will use to prep and analyze your data. When opening a snippet, you see code and detailed comments on what the code will accomplish. You will use these snippets as a guide when you load and prep your data and preform your analysis. Below is an image of the Prepare and Explore Data snippet. Notice each code step has accompanying comments.

You read through each snippet in the Machine Learning section. The command and structure of the code comes back to you pretty quickly and you're now ready to try it all out on your own data.
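The SAS-supplied snippet is not reproduced here, but the general pattern it follows (connect to CAS, load a table into memory, and explore it) looks roughly like the sketch below. The session name, libref, and table names are illustrative assumptions:

cas mysess;                                   /* connect to the CAS server            */
libname mycas cas;                            /* libref for in-memory CAS tables      */

proc casutil;
   load data=sashelp.heart casout="heart" outcaslib="casuser" replace;   /* sample data */
run;

proc mdsummary data=mycas.heart;              /* summary statistics computed in CAS   */
   var cholesterol ageatstart;
   output out=mycas.heart_summary;
run;

proc print data=mycas.heart_summary;
run;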

Uploading data

Now that you have an idea of what code you need to write, you need to load the data from the research department. You accomplish this by selecting the Server Files and Folders twisty and navigate to the Folder Shortcuts section. In this instance you want to upload your file into the shared/data directory (I'll explain why I chose this location in a moment). Use the Upload button to upload the research data file.

Upload file to the data directory

You're not alone

Files uploaded to shared/data are now visible by others logged into the environment. Wait, did I forget to mention this is a multi-user environment?! Well, yes, it is. You can invite others to collaborate on the project. To add and manage users, return to the Home screen (leaving SAS Studio open). Select the Team section in the left pane. The Team page lists users and displays an Invite button, used to send an invitation for system access to others.

Teams page

To invite others, click the button and enter the email address of the new user. This generates and sends an invitation in an email. The new user accepts the invite and now has access to the system. Using the URL provided in the email, the new user logs in using their own SAS Profile credentials. The default role for new users is ‘User.’ A user with admin privileges can change the role to ‘Admin.’ In the free trial, you are permitted to have a total of five users.

Shared data

You may have guessed by now the Data section lists directories and files located in the shared directory in SAS Studio.

Data page

You also notice here you have 5 GB of storage space. This includes shared and non-shared files.

I love this. How do I get more?

Now you know your way around the system and are ready to start coding. Return to SAS Studio, open a new program, and commence your analysis of the stem cell data. When you successfully deliver the project and impress your management chain, you can mention how the SAS Analytics Cloud solution made it all possible (and simple). You now have a case for departmental procurement of the solution, opening your organization up to more users, more storage, and more power to run advanced machine learning algorithms on your data.

Your turn

In this article I've outlined how to easily register for the SAS Machine Learning trial and start coding in a matter of minutes. Try it out yourself. Register, load your data, get coding, and solve your problem.

Related Resources

For more details on the development of SAS Analytics Cloud, check out Missy Hannah's interview with two UI developers on the project.

Zero to SAS in 60 Seconds- SAS Machine Learning on SAS Analytics Cloud was published on SAS Users.