The U.S. was really fortunate in having the recent total solar eclipse pass through so many of its states! This gave lots of people an opportunity to see it, with just a short (or moderate) drive. I think a little kid spoke for all of us when they said "Let's [...]
The Payment Services Directive 2 (PSD2) is a new European-wide regulation that requires European banks to make it easier to share customer transaction and account data (where the customer has given their consent) with third party providers, and it's the current hot topic across the payments industry in Europe. As [...]
What’s your favorite time of day? For me, it’s the morning with a cup of coffee, just as the sun starts to appear on the horizon, emitting a pleasant warm glow. My least favorite part of the day? Mornings without coffee when that same big ball of light makes a [...]
The post Learn neural networks and more at Analytics Experience 2017 appeared first on SAS Learning Post.
I have been working in the field of Professional Certification and Licensure for roughly 10 years. In that time, I worked with and observed close to 40 IT exams before coming to SAS. There has been a recurring question in each engagement. It is often phrased differently, but means mostly [...]
Your business is in a battle to retain and increase your share of wallet from a happy customer base. Winning takes analytics-driven real-time customer experiences. If you’re unconvinced of the value that using analytics brings to the customer experience, a recent Forbes Insight survey of nearly 360 executives makes a compelling case, with benefits including:
- Faster decision-making: 62 percent.
- Better insight into a common view of customers: 51 percent.
- Greater confidence in decision-making: 49 percent.
- Greater engagement with customers: 49 percent.
- Increased sales revenue: 47 percent.
- More repeat business from customers: 44 percent.
With benefits like these, you can imagine the risk to your bottom line of saying “no” to an analytical approach to customer experience: Forrester puts the cost at a 14 percent missed growth opportunity.
What consumers want
All modern consumers are transitioning from digital-first into digital-only users – and they expect you and every other business to achieve “digital parity”. In other words, your organisation needs to match or exceed the best experiences your customers have had with other organisations online or in self-serve environments. The result is that expectations of brand experiences are higher than ever in this data-driven world.
Why an analytical approach delivers against consumer expectations
Over the last decade, a plethora of technologies designed to help us better address our customers has come and gone. For consumers, the result has been fragmented and inconsistent experiences, and marketers have found themselves failing to deliver a stellar customer experience.
An analytical approach to customer experience enables organisations to move from simply reacting to customers in the moment, to predicting those moments and the appropriate outcomes in advance, to deliver a more considered and strategic experience. Integrated analytics and decisioning is the only way to balance an understanding of a customer’s attitudes, preferences, interests and needs with customer lifetime value, propensity and risk. This approach allows you to make accurate and profitable decisions about the right content, the right offer, the right price or the right product in the moment a customer engages with you. In real-time environments, all this decision-making must be done in milliseconds – something a predetermined, rules-based approach cannot deliver.
How can you capitalise on the real-time opportunity?
At the SAS Data and Customer Experience Forum, the data strategy and innovation director at a leading European broadcaster said that the key to improving customer experience was by “improving all of the small decisions that are made by organisations when they interact with customers.”
Small decisions are those made in response to an individual customer’s choices. Focusing on small decisions offers benefits in many ways:
- Improving risk management and matching price to risk.
- Reducing or eliminating fraud and waste.
- Increasing revenue by making the most of every opportunity.
- Improving the utilisation of constrained resources across the organisation.
The result is a superior customer experience.
By deploying a real-time decisioning framework, this leading broadcaster has experienced a significant sales uplift through online cross-sell and upsell activities. Their customer retention team has also used this approach to significantly reduce retention costs across the same number of customers.
Organisations need to adapt from making decisions at organisational speed to making decisions at customer speed. For instance, if a customer ignores an offer online, they should be served a more suitable alternative within seconds, based on all of their individual data, including lifetime value, propensity and attitudes, as well as new contextual information (e.g., their location or the device they’re using).
For Shop Direct, a UK-based digital retailer, the use of analytics to deliver a superior customer experience has paid off with a 43 percent increase in pre-tax profits. CEO Alex Baldock said, "We're making the most of how well we know our customer and being increasingly tailored to each of them."
Baldock admits personalisation needs to be executed effectively and that crude personalisation can be damaging to the customer relationship. "If a customer buys a pair of slippers and we just target them with slippers for the next year, that could be worse than not personalising at all." (source: essentialretail.com)
Are you ready to take a more analytical approach to customer experience? SAS helps many organisations start their journeys. Read more about our approach to real-time customer experience and how we can help you become a customer-first organisation, too.
There is an expression in the military that goes something like this: Military leaders train and prepare for the next war using available information, which typically means they're planning for the last (previous) war -- and that's a recipe for disaster. So what does this have to do with utilities [...]
All statisticians are familiar with the classical arithmetic mean. Some statisticians are also familiar with the geometric mean. Whereas the arithmetic mean of n numbers is the sum divided by n, the geometric mean of n nonnegative numbers is the nth root of the product of the numbers. The geometric mean is never greater than the arithmetic mean; the two are equal only when all the numbers are equal.
Did you know that there is a hybrid quantity, called the arithmetic-geometric mean, which is defined by combining the two quantities? The arithmetic-geometric mean is only defined for two positive numbers, x and y. It is defined as the limit of an alternating iterative process:
- Define a_1 = (x + y)/2 and g_1 = sqrt(x*y), the arithmetic and geometric means of x and y, respectively.
- Iteratively define a_{n+1} = (a_n + g_n)/2 and g_{n+1} = sqrt(a_n * g_n).
Interestingly, this process always converges. The number that it converges to is called the arithmetic-geometric mean of x and y, which is denoted by AGM(x,y). The AGM is between the geometric mean and the arithmetic mean. I am not aware of a multivariate generalization of the AGM function.
In SAS you can use the SAS/IML language or PROC FCMP to create a user-defined function that computes the arithmetic-geometric mean. Since the AGM is not a vector operation, I show the PROC FCMP approach:
proc fcmp outlib=work.funcs.MathFuncs;
   function AGM(x, y);
      if x<0 | y<0 then return( . );
      epsilon = 1e-10;          /* precision of computation */
      a = mean(x, y);
      g = geomean(x, y);
      do while (abs(a - g) > epsilon);
         a_new = mean(a, g);
         g_new = geomean(a, g);
         a = a_new;
         g = g_new;
      end;
      return( mean(a, g) );
   endsub;
quit;

/* test the function */
options cmplib=work.funcs;     /* define location of function */
data _NULL_;
   agm = AGM(1, 0.8);
   put agm 18.14;
run;
An example of calling the new AGM function is shown for the two numbers 1 and 0.8. The arithmetic mean of 1 and 0.8 is 0.9. The geometric mean of 1 and 0.8 is sqrt(0.8) = 0.8944. The computation shows that the arithmetic-geometric mean is 0.8972, which is between the arithmetic and geometric means.
The following program computes the graph of the AGM function and displays a contour plot of the result. Note that AGM(x, x) = x for any nonnegative value of x.
data grid;
do x = 0 to 5 by 0.1;
   do y = 0 to 5 by 0.1;
      agm = AGM(x, y);
      output;
   end;
end;
run;

/* see http://blogs.sas.com/content/iml/2012/07/02/create-a-contour-plot-in-sas.html */
proc sgrender data=grid template=ContourPlotParm;
dynamic _TITLE="Graph of Arithmetic-Geometric Mean AGM(x,y)"
        _X="x" _Y="y" _Z="AGM";
run;
So, what is the arithmetic-geometric mean good for? The AGM seems mainly useful in applied mathematics and number theory for computing elliptic functions and elliptic integrals. I am not aware of any applications to statistics, perhaps because the AGM is not defined for a sample that contains n elements when n > 2.
Although the AGM function is not used in statistics, the iterative algorithm that defines the AGM is a technique that I have seen often in computational statistics. When a researcher wants two conditions to hold simultaneously, one possible method is to alternately solve for each condition and prove that this iterative process converges to a solution that satisfies both conditions simultaneously. The iterative process can be an effective way to implement an optimization problem if each sub-optimization is simple.
In multivariate statistics, this technique is used in the method of alternating least squares, which is used in several SAS procedures such as PROC TRANSREG, VARCLUS, and MDS. In linear algebra, this technique is used in the method of alternating projections. I have used this technique to compute the closest correlation matrix to an arbitrary matrix.

One last interesting fact. There is another kind of mean called the harmonic mean. You can compute the arithmetic-harmonic mean by applying the same iterative algorithm, but replacing the GEOMEAN function with the HARMEAN function. The process converges, and the arithmetic-harmonic mean is equal to the geometric mean! Thus, although this two-step definition might seem contrived, it is "natural" in the sense that it produces the geometric mean (of two numbers) from the arithmetic and harmonic means.
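You can check the claim about the arithmetic-harmonic mean numerically. The following Python sketch (the function name and tolerance are my own) iterates the arithmetic and harmonic means; the reason the limit is the geometric mean is that the product a*h is invariant at every step, so the common limit L satisfies L^2 = x*y:

```python
import math

def arith_harm_mean(x, y, eps=1e-12):
    """Alternate arithmetic and harmonic means of two positive numbers."""
    a, h = (x + y) / 2, 2 * x * y / (x + y)
    while abs(a - h) > eps:
        # Invariant: ((a+h)/2) * (2*a*h/(a+h)) = a*h, so the product never
        # changes and the common limit L satisfies L^2 = x*y, i.e. L = sqrt(x*y).
        a, h = (a + h) / 2, 2 * a * h / (a + h)
    return (a + h) / 2

print(abs(arith_harm_mean(1, 0.8) - math.sqrt(0.8)) < 1e-9)   # prints True
```

For x = 1 and y = 0.8, the iteration converges to sqrt(0.8) = 0.894427, the geometric mean computed earlier.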
World Wildlife Fund (WWF) invests in education and training to create the next generation of conservation leaders from the around the world. These leaders go on to have a powerful influence on the fate of the world’s most biodiverse and vulnerable places. While scores of leaders benefit from WWF’s support, [...]
The stored compiled macro facility enables you to compile and save your macro definition in a permanent catalog in a library that you specify. The macro is compiled only once. When you call the macro in the current and subsequent SAS® sessions, SAS executes the compiled code from the macro catalog that you created when you compiled the macro.
The stored compiled macro facility has two main purposes. The first is that it enables your code to run faster because the macro code does not need to be compiled each time it is executed. The second purpose is to help you protect your code. Sometimes you need to share code that you’ve written with other users, but you do not want them to be able to see the code that is being executed. The stored compiled macro facility enables you to share the program without revealing the code. Compiling the macro with the SECURE option prevents the output of the SYMBOLGEN, MPRINT, and MLOGIC macro debugging options from being written to the log when the macro executes. This means that no code is written to the log when the code executes. After the macro has been compiled, there is no way to decompile it to retrieve the source code that created the catalog entry. This behavior prevents other users from being able to retrieve the code. However, it also prevents you from being able to recover the code.
It is very important to remember that there is no way to get back the code from a stored compiled macro. Because of this behavior, you should ALWAYS save your code when creating a stored compiled macro catalog. In order to update a stored compiled macro, you must recompile the macro. The only way to do this is to submit the macro definition again. Another important fact is that a stored compiled macro catalog can be used only on the same operating system and release of SAS that it was created on. So, in order to use a stored compiled macro on another operating system or release of SAS, that macro must be compiled in the new environment. Again, the only way to compile the macro is to resubmit the macro definition.
Save the Macro Source Code
To make it easier for you to save your code, the %MACRO statement contains the SOURCE option. When you create a stored compiled macro, the SOURCE option stores the macro definition as part of a catalog entry in the SASMACR catalog in the permanent SAS library listed on the SASMSTORE= system option.
Here is the syntax needed to create a stored compiled macro with the SOURCE option set:
libname mymacs 'c:\my macro library';   ❶
options mstored sasmstore=mymacs;       ❷

%macro test / store source;             ❸
   libname mylib1 'path-to-my-first-library';
   libname mylib2 'path-to-my-second-library';
%mend;
❶ The LIBNAME statement points to the SAS library that will contain my stored compiled macro catalog.
❷ The MSTORED system option enables the stored compiled facility. The SASMSTORE= option points to the libref that points to the macro library.
❸ The STORE option instructs the macro processor to store the compiled version of TEST in the SASMACR catalog in the library listed in the SASMSTORE= system option. The SOURCE option stores the TEST macro definition in the same SASMACR catalog.
Note that the contents of the SASMACR catalog do not contain a separate entry for the macro source. The source has been combined with the macro entry that contains the compiled macro. To verify that the source has been saved, add the DES= option to the %MACRO statement. The DES= option enables you to specify a description for the macro entry in the SASMACR catalog. For example, you could add the following description when compiling the macro to indicate that the source code has been saved:
%macro test / store source des=’Source code saved with entry’;
You can look at the contents of the macro catalog using the CATALOG procedure:
proc catalog cat=mymacs.sasmacr;
   contents;
run;
quit;
In the output from PROC CATALOG, you see the description indicating that the source code was saved with the macro entry.
Retrieve the Macro Source Code
When you need to update the macro or re-create the catalog on another machine, you can retrieve the macro source code using the %COPY statement. The %COPY statement enables you to retrieve the macro source code and write the code to a file. Here is the syntax:
%copy test / source outfile='c:\my macro library\test.sas';
This %COPY statement writes the source code for the TEST macro to the TEST.SAS file. Using TEST.SAS, you are now able to update the macro or compile the macro on another machine.
Remember, you should always save your source code when creating a stored compiled macro. Without the source code, you will not be able to update the macro or move the macro to a new environment.
Always save your code when creating a stored compiled macro was published on SAS Users.
Get on with your day faster by taking a self-service approach to data preparation.
The post Self-service data preparation gives fast access to the data you need appeared first on The Data Roundtable.