Analyzing ticket sales and customer data for large sports and entertainment events is a complex endeavor. But SAS Visual Analytics makes it easy, with location analytics, customer segmentation, predictive artificial intelligence (AI) capabilities – and more. This blog post covers a brief overview of these features by using a fictitious event company [...]
In SAS Visual Analytics 7.4 on 9.4M5 and SAS Visual Analytics 8.2 on SAS Viya, the periodic operators have a new additional parameter that controls how filtering on the date data item used in the calculation affects the calculated values.
The new parameter values are:
These parameter values enable you to improve the appearance of reports based on calculations that use periodic operators. You can have periods that produce missing values for periodic calculations removed from the report, but still available for use in the calculations for later periods. These parameter settings also enable you to provide users with a prompt for choosing the data to display in a report, without having any effect on the calculations themselves.
The following will illustrate the points above, using periodic Revenue calculations based on monthly data from the MEGA_CORP table. New aggregated measures representing Previous Month Revenue (RelativePeriod) and Same Month Last Year (ParallelPeriod) will be displayed as measures in a crosstab. The default _ApplyAllFilters_ is in effect for both, as shown below, but there are currently no filters on the report or its objects.
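For reference, aggregated measures like these are built with the RelativePeriod and ParallelPeriod operators. A rough sketch of the two expressions follows; the exact operator signatures vary between releases, so treat these as illustrative and verify against your version's documentation:

```
/* Previous Month Revenue: one month back from the current period */
RelativePeriod(_Sum_, 'Revenue'n, 'Date by Month'n, -1, _ApplyAllFilters_)

/* Same Month Last Year: the same month, one year back */
ParallelPeriod(_Sum_, 'Revenue'n, 'Date by Month'n, _ByYear_, -1, _ApplyAllFilters_)
```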
The Change from Previous Month and Change From Same Month Last Year calculations, respectively, are below:
The resulting report is a crosstab with Date by Month and Product Line in the Row roles, and Revenue, along with the 4 aggregations, in the Column roles. All calculations are accurate, but of course, the calculations result in missing values for the first month (Jan2009) and for the first year (2009).
An improvement to the appearance of the report might be to show only Date by Month values beginning with Jan2010, where there are no missing values. Why not apply a filter to the crosstab (shown below), so that the interval shown is Jan2010 to the most recent date?
With the above filter applied to the crosstab, the result is shown below—same problem, different year!
This is where the new parameter on our periodic operators is useful. We would like all months to be used in the calculations, but only the months with non-missing values for both periodic calculations to be shown in the crosstab. So, edit both periodic calculations to change the default _ApplyAllFilters_ to _IgnoreAllTimeFrameFilters_, so that the filters apply to the data displayed in the crosstab, but not to the data used in the calculations. When the report is refreshed, only the months with non-missing values are shown:
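As a sketch, the edit amounts to swapping the filtering parameter in each expression, along these lines (the operator signature shown is illustrative; check your release's documentation for the exact form):

```
/* Before: filters on Date by Month also filter the calculation */
RelativePeriod(_Sum_, 'Revenue'n, 'Date by Month'n, -1, _ApplyAllFilters_)

/* After: filters affect what is displayed, not what is calculated */
RelativePeriod(_Sum_, 'Revenue'n, 'Date by Month'n, -1, _IgnoreAllTimeFrameFilters_)
```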
This periodic operator parameter is also useful if you want to enable users to select a specific month, for viewing only a subset of the crosstab results.
For a selection prompt, add a Drop-Down list to select a MONYY value and define a filter action from the Drop-Down list to the Crosstab. To prevent selection of a month value with missing calculation values, you will also want to apply a filter to the Drop-Down list as you did for the crosstab, displaying months Jan2010 and after in the list.
Now the user can select a month, with all calculations relative to that month displayed, shown in the examples below:
Note that, at this point, since you’ve added the action from the drop-down list to the crosstab, you no longer need the filter on the crosstab itself. In addition, if you remove the crosstab filter, then all of your filters will come from prompts or actions, so you could use the _IgnoreInteractiveTimeFrameFilters_ parameter on your periodic calculations instead of the _IgnoreAllTimeFrameFilters_ parameter.
You will also notice that, in release 8.2 of SAS Visual Analytics, the performance of the periodic calculations has been greatly improved, with more of the work done in CAS.
Be sure to check out all of the periodic operators, documented for SAS Visual Analytics 7.4 and for SAS Visual Analytics 8.2.

SAS Visual Analytics filters on periodic calculations: Apply them or ignore them! was published on SAS Users.
SAS Visual Analytics 8.2 introduces the Hidden Data Role. This role can accept one or more category or date data items which will be included in the query results but will not be displayed with the object. You can use this Hidden Data Role in:
- Mapping Data Sources.
- Color-Mapped Display Rules.
- External Links.
Note that this Hidden Data Role is not available for all objects, and a data item cannot serve as both a Hidden Data Role and a Data tip value; it can only be assigned to one of these roles.
In this example, we will look at how to use the Hidden Data Role for an External Link.
Here are a few applications of this example:
- You want to show an index of available assets, and you have a URL to point directly to that asset.
- Your company sells products, and you want to show a table summary of product profit with a URL that points to each product’s development page.
- As the travel department, you want to see individual travel reports rolled up to owner, but have a URL that can link out to each individual report.
The applications are endless when applied to our customer needs.
In my blog example, I have NFL data for Super Bowl wins. I have attached two columns of URLs for demonstration purposes:
- One URL is for each Super Bowl event, so I have 52 URLs, one for each row of data.
- The second URL is for each winning team. There have been 20 unique Super Bowl winning teams, so I have 20 unique URLs.
In previous versions of SAS Visual Analytics, if you wanted to link out to one of these URLs, you would have to include it in the visualization like in the List Table shown above. But now, using SAS Visual Analytics 8.2, you can assign a column containing these URLs to the Hidden Data Role and it will be available as an External URL.
Here is our target report. We want to be able to link to the Winning Team’s website.
In Visual Analytics 8.2, for the List Table, assign the Winning Team URL column to the Hidden Data Role.
Then, for the List Table, create a new URL Link Action. Give the Action a name and leave the URL section blank, because my data column contains a fully qualified URL. If you were linking to a destination and only needed to append a name-value pair, you could enter the partial URL and pass the parameter value, but that’s a different example.
That example used the column with 20 URLs, one per winning team, in the Hidden Data Role. Now, what if we use the column with 52 URLs that link out to the individual Super Bowl events?
That’s right, the cardinality of the Hidden Data Role item does impact the object. Even though the Hidden data item is not visible on the Object, remember it is included in the results query; and therefore, the cardinality of the Hidden data item impacts the aggregation of the data.
Notice that some objects will just present an information warning that a duplicate classification of the data has caused a conflict.
In conclusion, the Hidden Data Role is an exciting addition to the SAS Visual Analytics 8.2 release. I know you'll enjoy and benefit from it.
The power behind a Hidden Data Role in SAS Visual Analytics was published on SAS Users.
Let’s say that you are administering a SAS 9.4 environment that is working just fine. You’ve checked that your full backups are indeed happening and you’ve even tried restoring from one of your backups. You are prepared for anything, right? Well, I’d like to propose a scenario to you. You probably have users responsible for creating reports, maybe even very important reports. What if something happened to one of these reports? Perhaps the user wants to revert to an earlier version. Perhaps the report was accidentally deleted or even corrupted, what then? Restoring a full backup in this situation might help this one user but would likely inconvenience most other users. With a little more preparation, you could “magically” restore a single report if needed. Here’s what you need to do: create a backup of only these critical reports using the promotion tools.
The promotion tools include:
- the Export SAS Package Wizard and the Import SAS Package Wizard available in SAS Management Console, SAS Data Integration Studio, and SAS OLAP Cube Studio.
- the batch export tool and the batch import tool.
Note: Starting with the third maintenance release of SAS 9.4, you can use the -disableX11 option to run the batch import and batch export tools on UNIX without setting the DISPLAY variable.
You can use the promotion tools on almost anything found in the SAS Folder tree, especially if you use SAS Management Console. If you use the wizards in SAS Data Integration Studio or SAS OLAP Cube Studio, those applications only allow you to access and export/import objects that pertain to that application, a subset of what is available in SAS Management Console.
You may be thinking that using an interactive wizard is not really the answer you are looking for and you may be right. The batch tools are a great solution if you want to schedule the exporting of some objects on a regular basis. If you are unfamiliar with the promotion tools, I would suggest you start with the interactive wizards. You will find that the log produced by the wizard includes the equivalent command line you would use. It’s a nice way to explore how to invoke the batch tools.
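For example, a batch export invocation looks roughly like the following sketch (the profile name, paths, and folder are hypothetical; review the promotion tools documentation for the full option list):

```
ExportPackage -profile "MyAdminProfile"
    -package "/backups/marketing_reports.spk"
    -objects "/Shared Data/Marketing(Folder)"
    -includeDep
    -log "/backups/export_marketing.log"
```

Scheduling a command like this with your operating system's scheduler gives you a regular, restorable snapshot of just your critical reports.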
Creating the Export Package
How to invoke the Export SAS Package Wizard:
1. Right-click on a folder or object in the SAS Folders tree and select Export SAS Package.
2. Enter the location and name of the package file to be created and set options as appropriate.
You can opt to Include dependent objects when retrieving initial collection of objects here or you can select specific dependent objects on the next screen.
Filtering offers some very interesting ways of selecting objects including:
- By object name
- By object type
- By when objects were created
- By when objects were last modified
3. Select the objects to export. If you started the process with a folder, you will be presented with the folder and all of its contents selected by default. You can deselect specific objects as you like.
In this example, we only want the Marketing folder and its contents. Deselect the other folders. You want to be careful to not create a package file that is too big.
You can click on individual objects and explore what dependencies the object has, what other metadata objects use the current object, and the options and properties for the object.
In this example, the Marketing Unit Report is dependent on the MEGACORP table whose metadata is found in the /Shared Data/LASR Data folder. When you import this report, you will need to associate the report with the same or similar table in order for the report to be fully functional.
If you had selected Include dependent objects when retrieving initial collection of objects on the previous screen, all of the dependent objects would be listed and be selected for export by default.
Bonus things you get by default in the export package include:
- Permissions set directly on the objects
- For most object types, the export tools include both metadata and the associated physical content. For example, with reports you get both the metadata and associated report XML. For a complete list of physical content promoted with metadata objects, refer to:
4. When the export process is complete (hopefully without errors), review the log.
At the top of the log, you can see the location of the log file in case you want to refer to it later.
If you scroll to the end of the log, you’ll find the command line to invoke the batch export tool to create the same package.
Considerations for Exporting
- What account should you use to do the export or import? Use an account that has sufficient access to metadata and, when including items stored on the operating system, the ability to run a workspace server and permissions to the physical resources in question. For more details, review the documentation. The documentation also shows you how to adjust memory settings if you need to. You can also create multiple, smaller packages as an alternative to one very large package.
- Does it matter if you grab all of the dependencies when you export or not? It really depends on what’s available when you import objects.
- You do not have to initiate the export from the SAS Folders folder. You can initiate the export at a specific folder, or a specific object, or group of objects.
Importing to the Rescue
Let’s talk about what happens if and when you actually need to import some or all of the objects in a package file.
Let’s take a look at what we would need to do to replace an accidentally deleted report, Marketing Unit Report.
How to invoke the Import SAS Package Wizard:
1. Right-click on the same folder where you started the export, the SAS Folders folder in our example, and select Import SAS Package. It is important to initiate the import from the same folder where you started the export if you want to end up with the same folder structure.
2. If needed, use the Browse functionality to locate the correct package file.
Include access controls
By default, Include access controls is not selected. This option will import permission settings directly applied to the objects in the package. It will not import any permissions if there were only inherited permissions on the object in the source environment.
Since we are bringing the report back into the folder it originally came from, it makes sense to also include direct permissions, if there were any.
If you do not check the Include access controls box and there are in fact some direct permissions on objects being imported, you will get this warning later in the wizard:
Select objects to import
If you’re not sure whether to select to import All objects or New objects only, you can always start with all objects. You can use the Back buttons in the wizard to go back to previous prompts and change selections, at least before you kick off the actual import process.
3. If you selected to import all objects on the first screen, you will see a listing of all objects. Each object has an icon indicating whether the object currently exists where you are doing the import. The red exclamation mark indicates the object currently exists, and importing it will overwrite the current object with the copy from the package. The asterisk icon indicates that the object does not currently exist and will be created by the import process.
In our example, the Marketing Unit Report does not currently exist in the Marketing folder but is in the package file so it is labeled with an asterisk. The other two reports are both in the folder and the package file so they are labeled with red exclamation marks.
You’ll want to make the appropriate selections here. If you want all of the contents of the package to be written to the Marketing folder, overwriting the first two reports and adding the Marketing Unit Report, leave all objects selected. If one of the reports had become corrupted, you could use this method to overwrite the current copy with the version stored in the package file.
If you just want to replace the missing Marketing Unit Report, make sure only that object is selected as below:
By default, objects are imported into the same folder structure they were in when the export package was created.
4. Part of the import process is to establish associations between the objects you are importing and metadata not included in the package. You are first presented with a list of the metadata values you will need to select.
5. Set the target value(s) as needed.
In our example, we definitely want the report to use the same table it used originally.
If we were moving objects to a new folder or a new environment, you might want to associate the report with a different table.
If you use the batch import tool, changing these associations would be done in a substitution properties file.
6. Review the import summary and initiate the import process.
7. Hopefully, the process completes without errors and you can review the log.
8. Finish things off by testing the content you imported. In this case, we would log in to SAS Visual Analytics and view the Marketing Unit Report.
Considerations for Importing
- If you initiated the export from the SAS Folders folder and try to import the package from another folder, Marketing for example, the wizard will recreate everything in the package, including a new Marketing subfolder which is probably not what you intended.
Notice the new Marketing folder inside the current Marketing folder. In addition, all three reports are considered new since the new Marketing subfolder does not currently exist.
- The account you use to do the import should have enough access to metadata and the operating system.
- Decide what you want to export, how often, and how long you want to keep a specific package file.
- Once you’ve gotten comfortable with the wizards and you want to schedule an export (or several), you should try out the batch export and import tools. When you name the export package, you can consider customizing the package name to include the date to avoid overwriting the same package file each time.
Review the documentation on both the wizards and the batch tools.

Admin Notebook: Making the case for selectively backing up metadata was published on SAS Users.
By default, SAS Visual Analytics 7.4 supports country and state level polygons for regional geomaps. In SAS Visual Analytics 7.4, custom shape files are now supported as well. This means that if a site has its own custom polygon data that defines custom regions, it’s possible to create a region geomap that displays those regions.
Implementing the process requires completing some preparatory steps, specifically the execution of some SAS code, but the steps are explained in Appendix 2 of the SAS Visual Analytics 7.4: Administration Guide. The SAS program that completes the steps is available for download at http://support.sas.com/rnd/datavisualization/vageo/va74polygons.sas.
Two examples using the program are provided in Appendix 2 for US counties and German provinces. The instructions in Appendix 2 assume that the custom polygon data is provided in ESRI shape file format, which is likely the most common use-case. The site will need access to a SAS programming environment and SAS/GRAPH software, and whoever completes the process will need access to the SAS Visual Analytics configuration directory and the ability to restart services—so an administrator-type person will be required.
One common request is to provide a regional geomap, where the regions are site-defined groups of states or provinces of a country. In this example problem, the site has sales data for each sales region in the US and would like to display a geo map of the regions.
For this type of region/province example, you will likely be able to use one of the maps already provided by SAS in the MAPSGFK library to produce your region boundaries. For more information on the datasets in the MAPSGFK library, see this paper.
The MAPSGFK.US_STATES dataset contains the data required to overlay all states of the US on a VA region geomap and has these columns:
The highlighted columns, STATECODE, LONG, and LAT will be particularly useful, but first, the sales region (REGION) column and values must be added using simple data step code. The unnecessary FIPS code (STATE) can be dropped in the same DATA step. Note that the region values are assigned in upper case, as these will later be converted to ID values, which VA expects to be in upper case.
data regions;
   length region $ 12;
   drop state;
   set mapsgfk.us_states;
   if statecode in ('AK','HI','PR') then delete;
   else if statecode in ('WA','MT','OR','ID','WY') then region='NORTHWEST';
   else if statecode in ('CA','NV','UT','AZ','CO','NM') then region='SOUTHWEST';
   else if statecode in ('ND','SD','NE','MN','WI','MI','IA','IL','IN') then region='NORTHCENTRAL';
   else if statecode in ('KS','OK','TX','MO','AR') then region='SOUTHCENTRAL';
   else if statecode in ('ME','NH','VT','MA','RI','CT','NY','PA','NJ','OH','DE','MD','DC') then region='NORTHEAST';
   else if statecode in ('KY','WV','VA','TN','NC','MS','AL','LA','GA','SC','FL') then region='SOUTHEAST';
run;
The data is then sorted by the REGION values, a requirement of the SAS/GRAPH GREMOVE procedure, which is used to remove the internal state boundary data points, leaving the region boundary points only.
proc sort data=regions;
   by region;
run;

proc gremove data=regions out=mapscstm.regions1;
   by region;
   id statecode;
run;
To complete the process, since the LAT and LONG values are already in the form that VA needs (unprojected) and we are using a SAS dataset rather than the ESRI shape file format, we’ll only use a part of the code from the downloadable program mentioned at the beginning of the blog.
First, create a mapscstm directory under /SASHome/SASFoundation/9.4 to store the custom polygon dataset. Make sure that the library is accessible to the SAS session by including a libname statement in the appserver_autoexec_usermods.sas file, found in config/Lev1/SASApp, and then restarting the Object Spawner.
libname MAPSCSTM "/SASHome/SASFoundation/9.4/mapscstm";
Tip: Be sure to back up the original ATTRLOOKUP and CENTLOOKUP datasets before running any additional code, as you will be modifying the originals.
To complete creation of the polygon dataset, you will need to execute only a part of the downloadable program to:
- Make sure that your polygon dataset has all of the columns expected by SAS Visual Analytics.
- Add the region attributes to the ATTRLOOKUP dataset.
- Add the region center point locations to the CENTLOOKUP dataset.
%let REGION_LABEL=USRegions;           /* The label for the custom region */
%let REGION_PREFIX=R1;                 /* unique ISO 2-Letter Code */
%let REGION_ISO=000;                   /* unique ISO Code */
%let REGION_DATASET=MAPSCSTM.REGIONS1; /* Polygon data set to be created - be sure to use suffix "1" */
Note that the downloadable program includes additional macro assignments and additional code, but since our data is already in the form of a SAS dataset, rather than ESRI shape file format, we won’t be using all of the code.
The following DATA step adds the necessary columns/values to the polygon dataset so that the form of the data is what is expected by VA. Note that the LAT and LONG columns are already in unprojected form, so we just assign the same values to X and Y. (VA doesn’t actually use the X,Y columns from the polygon dataset.)
data &REGION_DATASET.;
   set &REGION_DATASET.;
   where density <= 3;
   id=region;
   idname=region;
   x=long;
   y=lat;
   ISO = "&REGION_ISO.";
   RESOLUTION = 1;
   LAKE = 0;
   ISOALPHA2 = "&REGION_PREFIX.";
   AdminType = "regions";
   keep ID SEGMENT IDNAME LONG LAT X Y ISO DENSITY RESOLUTION LAKE ISOALPHA2 AdminType;
run;
Then PROC SQL steps are executed to add rows relative to the custom polygons to the ATTRLOOKUP and CENTLOOKUP datasets:
This step adds the USRegions row to ATTRLOOKUP:
proc sql;
   insert into valib.attrlookup
      values (
         "&REGION_LABEL.",   /* IDLABEL=State/Province Label */
         "&REGION_PREFIX.",  /* ID=SAS Map ID Value */
         "&REGION_LABEL.",   /* IDNAME=State/Province Name */
         "",                 /* ID1NAME=Country Name */
         "",                 /* ID2NAME */
         "&REGION_ISO.",     /* ISO=Country ISO Numeric Code */
         "&REGION_LABEL.",   /* ISONAME */
         "&REGION_LABEL.",   /* KEY */
         "",                 /* ID1=Country ISO 2-Letter Code */
         "",                 /* ID2 */
         "",                 /* ID3 */
         "",                 /* ID3NAME */
         0                   /* LEVEL (0=country level, 1=state level) */
      );
quit;
This step adds a row to ATTRLOOKUP for each individual region:
proc sql;
   insert into valib.attrlookup
      select distinct
         IDNAME,             /* IDLABEL=State/Province Label */
         ID,                 /* ID=SAS Map ID Value */
         IDNAME,             /* IDNAME=State/Province Name */
         "&REGION_LABEL.",   /* ID1NAME=Country Name */
         "",                 /* ID2NAME */
         "&REGION_ISO.",     /* ISO=Country ISO Numeric Code */
         "&REGION_LABEL.",   /* ISONAME */
         trim(IDNAME) || "|&REGION_LABEL.",  /* KEY */
         "&REGION_PREFIX.",  /* ID1=Country ISO 2-Letter Code */
         "",                 /* ID2 */
         "",                 /* ID3 */
         "",                 /* ID3NAME */
         1                   /* LEVEL (1=state level) */
      from &REGION_DATASET.;
quit;
This step calculates and adds the central location point for each of the regions to the CENTLOOKUP dataset. The site data contains only the 48 contiguous states (no Alaska or Hawaii). If Alaska and Hawaii had been included, a different algorithm would need to be used to calculate the central location.
proc sql;
   /* Add custom region */
   insert into valib.centlookup
      select distinct
         "&REGION_DATASET." as mapname,
         "&REGION_PREFIX." as ID,
         avg(x) as x,
         avg(y) as y
      from &REGION_DATASET.;
   /* Add custom provinces */
   insert into valib.centlookup
      select distinct
         "&REGION_DATASET." as mapname,
         ID as ID,
         avg(x) as x,
         avg(y) as y
      from &REGION_DATASET.
      group by id;
quit;
After executing the code above, you will need to restart the Web Application server, so that SAS Visual Analytics has access to the new polygons.
Code is also included in the downloadable program to create a dataset for validating your results. The validate dataset includes a column for the ID and IDNAME of the regions, in addition to two randomly calculated measures. In our case, we will instead just use our original REGIONSALES dataset containing the regional sales data.
1. Sign into SAS Visual Analytics and create a new exploration with data source REGIONSALES.
2. Create a Geo data item from Regions: Right-click Regions, select Geography > Subdivision (State, Province) Names. From the Country or Region drop-down list, select the USRegions region label.
3. Create a geo map visualization. Select Regions for the map style, Regions for the Geography role, and salesamt for the Color role.
Your regions should display, similar to this:
You can also include the region data item in a hierarchy with the state data item to produce a drill-down region map:
Or a bubble or coordinate map:
I hope this example has been helpful to users of SAS Visual Analytics 7.4. In my next blog, you will see that this process is tremendously simplified by new mapping features in SAS Visual Analytics 8.2.
Creating a custom regional map in SAS Visual Analytics 7.4 was published on SAS Users.
Another report requirement came my way, and I wanted to share how to use SAS Visual Analytics’ out-of-the-box relative period calculations to solve it.
Essentially, we had a customer who wanted to see a metric for every month, the previous month’s value next to it, and lastly the difference between the two.
To do this in SAS Visual Analytics, use the relative period operators, which are available in versions 7.3 and above. I am going to use the Mega_Corp data, which has a date data item called Date by Month using the format MMMYYYY. SAS Visual Analytics supports relative period calculations for month, quarter and year.
The first two columns, circled in red, are straight from the data. The metric we are interested in for this report is Profit.
Next, we will create the last column, Profit (Difference from Previous Period), which is an aggregated measure that uses the periodic operators.
From the Data pane, select the metric used in the list table, Profit. Then right-click on Profit and navigate the menus: Create / Difference from Previous Period / Using: Date by Month.
A new aggregated measure will be created for you:
If you right-click on the aggregated measure and select Edit Aggregated Measure…, you will see this relative period calculation, where it is taking the current period (notice the 0) minus the value for the previous period (notice the -1).
Okay – that’s it. This out-of-the-box relative period calculation is ready to be added to the list table. Notice the other Period Operators available in the list. These support SAS Visual Analytics’ additional out-of-the-box aggregated measure calculations such as the Difference between Parallel Periods, Year to Date cumulative calculations, etc.
Now we have to create the final column to meet our report requirement: the Previous Period column.
To do this we are going to leverage the out-of-the-box functionality of the relative period calculation. Since this aggregated measure calculates the previous period for the subtraction – let’s use this to our advantage.
Duplicate the out-of-the-box relative period calculation by right-clicking on Profit (Difference from Previous Period) and select Duplicate Data Item.
Then right-click on the new data item, and select Edit Aggregated Measure….
Now delete everything highlighted in yellow below; remember to also delete the minus sign. Then give the data item a new name and click OK. This creates an aggregated measure that calculates the previous period’s value.
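Sketching the edit in the expression's text form (the exact syntax varies by release, so treat this as illustrative): the generated difference calculation subtracts the period at offset -1 from the period at offset 0, and deleting the first operand along with the minus sign leaves just the previous period.

```
/* Generated: Difference from Previous Period */
RelativePeriod(_Sum_, 'Profit'n, 'Date by Month'n, 0)
   - RelativePeriod(_Sum_, 'Profit'n, 'Date by Month'n, -1)

/* After editing: Previous Period only */
RelativePeriod(_Sum_, 'Profit'n, 'Date by Month'n, -1)
```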
The final result should look like this from either the Visual tab or Text tab:
Now we have all the columns to meet our report requirement:
Now that I’ve piqued your interest, I’m sure you are wondering: could you use this technique to create aggregated data items representing period offsets of -1, -2, or -3? Yes! This is absolutely possible.
Also, I went ahead and plotted the Difference from Previous Period on a line chart. This is an extremely useful visualization to gauge whether the variance between periods is acceptable. You can easily assign display rules to this visualization to flag any periods that may need further investigation.
SAS Visual Analytics 7.4 has added the support for date parameters. Recall from my first post, Using parameters in SAS Visual Analytics, a parameter is a variable whose value can be changed at any time by the report viewer and referenced by other report objects. These objects can be a calculated item, aggregated measure, filter, rank or display rule. And remember, every time the parameter is changed, the corresponding objects are updated to reflect that change.
Here is my updated table that lists the supported control objects and parameter types for SAS Visual Analytics 7.4. The type of parameter is required to match the type of data that is assigned to the control.
Notice that SAS Visual Analytics 7.4 has also introduced the support for multiple value selection control objects. I’ll address these in another blog.
Let’s look at an example of a SAS Visual Analytics Report using date parameters. In this fictitious report, we have been given the requirements that the user wants to pick two independent date periods for comparison. This is not the same requirement as filtering the report between a start and end date. This report requirement is such that a report user can pick two independent months in the source data to be able to analyze the change in Expense magnitude for different aggregation levels, such as Region, Product Line and Product.
In this example, we will compare two different Month,Year periods. This could easily be two different Quarter,Year or Week,Year periods; depending on the report requirements, these same steps can be applied.
In this high level breakdown, you can see in red I will create two date parameters from data driven drop-down lists. From these parameter values, I will create two calculated data items, shown in purple, and one aggregated measure that will be used in three different report objects, shown in green.
Here are the steps:
1. Create the date parameters.
2. Add the control objects to the report and assign roles.
3. Create the dependent data items, i.e. the calculated data items and aggregated measure.
4. Add the remaining report objects to the canvas and assign roles.
Step 1: Create the date parameters
First we will need to create the date parameters that will hold the values made by the report viewers. From the Data Pane, use the drop-down menu and select New Parameter….
Then create your first parameter as shown below. Give it a name.
Next, select the minimum and maximum values allowed for this parameter. I used the min and max available in my data source, but you could select a narrower range if you wanted to restrict users to only a portion of the data, as long as the values are in your data source; in this example, we will use the data source to populate the available values in the drop-down list.
Then select a current value; this will serve as the default value that populates when a user first opens the report.
Finally, select the format for your parameter. I selected the same format as the underlying data item I will use to populate the drop-down list.
Notice how your new parameters will now be available from your Data Pane.
Step 2: Add the control objects to the report and assign roles
Next, drag and drop the drop-down list control objects onto the report canvas. In this example, we are not using the Report or Section Prompt areas since I do not want to filter the objects in the report or section automatically. Instead, I am using these prompt values and storing them in a parameter. I will then use those values to create new calculated data items and an aggregated measure.
Once your control objects are in the report canvas, use the Roles Pane to assign the data items to the roles. As you can see from the screenshot, we are using the Date by Month data item to seed the values of the drop-down list by assigning it to the Category role; this data item comes from our data source.
Then we are going to assign our newly created parameters, Period1Parameter and Period2Parameter to the Parameter role. This will allow us to use the value selected in our calculations.
Step 3: Create the dependent data items, i.e. the calculated data items and aggregated measure
Now we are free to use our parameters as we like. In this example, I am prompting the report viewer for two values, Period 1 and Period 2, which are the two periods the user would like compared in this report. We will need to create two calculated data items from a single column in our source data. This technique lets us display these as columns next to each other in a crosstab object and use them in an aggregated measure.
Calculated Data Item: Period 1 Expenses
From the Data Pane, use the drop-down menu and select New Calculated Item…. Then use the editor to create this expression: If the Date by Month for this data row equals the parameter value selected for Period 1, then return the Expenses; else return 0.
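The row-level logic of this calculated item can be sketched outside of Visual Analytics. Below is a hypothetical Python illustration, not the VA expression itself; the function name, column values, and sample figures are all assumptions for demonstration.

```python
# Hedged sketch of the Period 1 Expenses calculated-item logic:
# keep Expenses for rows whose Date by Month matches the parameter
# value the user selected; zero out every other row.
def period_expenses(row_month, parameter_month, expenses):
    """Return expenses when row_month equals the selected parameter
    value; otherwise return 0 so other months contribute nothing."""
    return expenses if row_month == parameter_month else 0

# A row from the selected month keeps its expenses...
print(period_expenses("Nov2011", "Nov2011", 125.0))  # 125.0
# ...while rows from any other month become 0.
print(period_expenses("Dec2011", "Nov2011", 125.0))  # 0
```

Returning 0 (rather than missing) for non-matching rows is what allows the two period columns to sit side by side and aggregate cleanly later.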
Calculated Data Item: Period 2 Expenses
Repeat this using the Period2Parameter in the expression.
Aggregated Measure: Period Difference
Next, we want to calculate the difference between the two user selected Period Expenses. To do this, we will need to create an aggregated measure which will evaluate based on the report object’s role assignments. In other words, it will be calculated “on-the-fly” based on the visualization.
Similar to the calculated data items, use the Data Pane and from the drop-down menu select New Aggregated Measure…. Use the editor to create this expression. Notice that we are using our newly created calculated data items we defined using the parameter values. This expression does not use the parameter value directly, but indirectly through the calculated data item.
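To make the aggregation behavior concrete, here is a minimal Python sketch of how such a period-difference measure evaluates over whichever rows fall into a given report cell. The toy rows, the grouping, and the direction of subtraction (Period 2 minus Period 1) are assumptions, not taken from the report.

```python
# Hedged sketch: each row carries the two calculated columns
# (Period 1 Expenses, Period 2 Expenses); the measure sums each
# column over the rows in a cell and takes the difference.
rows = [
    # (region, period1_expenses, period2_expenses) -- invented values
    ("East", 100.0, 0.0),
    ("East", 0.0, 140.0),
    ("West", 80.0, 0.0),
    ("West", 0.0, 60.0),
]

def period_difference(subset):
    """Sum(Period 2 Expenses) - Sum(Period 1 Expenses) over a cell."""
    return sum(r[2] for r in subset) - sum(r[1] for r in subset)

east = [r for r in rows if r[0] == "East"]
print(period_difference(east))  # 40.0
```

Because the parameters only feed the calculated columns, the measure itself never references them directly, which is exactly the indirection described above.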
Step 4: Add the remaining report objects to the canvas and assign roles
Now that we have:
- our Control Objects to capture the user input,
- the Parameters to store the values,
- and the Calculated Data Items and Aggregated Measure created,
we can add our report objects to the canvas and assign our roles.
You can see I used all three new measures in the crosstab object. I used the aggregated measure in the bar chart and treemap but notice the different aggregation levels. There is even a hierarchy assigned to the treemap category role. This Period Difference aggregated measure calculation is done dynamically and will evaluate for each visualization with its unique role assignments, even while navigating up and down the hierarchy.
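The "evaluate per visualization" behavior can be illustrated with a small sketch: the same difference logic, re-grouped by whichever role each object uses. The data and group keys below are invented for illustration only.

```python
# Hedged sketch: one measure definition, re-evaluated per grouping,
# mimicking how each report object applies its own role assignments.
from collections import defaultdict

rows = [
    # (region, product_line, period1, period2) -- toy values
    ("East", "Game", 100.0, 140.0),
    ("East", "Toy",  50.0,  45.0),
    ("West", "Game", 80.0,  60.0),
]

def period_difference_by(rows, key_index):
    """Accumulate Period 2 minus Period 1 per group key."""
    totals = defaultdict(float)
    for r in rows:
        totals[r[key_index]] += r[3] - r[2]
    return dict(totals)

print(period_difference_by(rows, 0))  # grouped by Region
print(period_difference_by(rows, 1))  # grouped by Product Line
```

Each call regroups the same underlying rows, just as the bar chart, treemap, and crosstab each re-aggregate the measure for their own categories and hierarchy levels.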
Here are some additional screenshots of different period selections.
In this first screenshot you can see the parallel period comparison between December 2010 and 2011.
In these next two screenshots, we are looking at the Thanksgiving Black Friday month of November. We are comparing the two years 2010 and 2011 again. Here we see that the Board Product from the Game Product Line is bright blue indicating an increase in magnitude of Expenses in the most recent period, Nov2011.
By double-clicking on Board in the treemap, we are taken to the next level of the hierarchy, Product Description, where we see that the largest magnitude of Expenses comes from Backgammon and Bob Board Games.
In these final two screenshots we are comparing consecutive periods, November 2011 with December 2011. From the bar chart, we can easily see the Region and Product Line with the greatest increase in Expenses.
I’ve configured a brush interaction between all three visualizations so that when I select the tallest bar it will highlight the corresponding data values in the crosstab and treemap.
Now you can use date parameters in your Visual Analytics reports. There are several applications of this feature, and this is only one way you can use parameters to drive business intelligence. Using this technique to create columns based on a user-selected value is great when you need to compare values and your source data isn’t structured in this manner.
Using Date Parameters in your SAS Visual Analytics Reports was published on SAS Users.