June 19, 2017
 

Most numerical optimization routines require that the user provide an initial guess for the solution. I have previously described a method for choosing an initial guess for an optimization, which works well for low-dimensional optimization problems. Recently a SAS programmer asked how to find an initial guess when there are linear constraints and bounds on the parameters. There are two simple approaches for finding an initial guess that is in the feasible region. One is the "shotgun" approach in which you generate many initial guesses and evaluate each one until you find one that is feasible. The other is to use the NLPFEA subroutine in SAS/IML, which takes any guess and transforms it into a feasible point in the linearly constrained region.
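To make the "shotgun" approach concrete, here is a minimal Python sketch (separate from the SAS code in this post; the `is_feasible` and `shotgun_guess` names are mine, and the constraints mirror the pentagonal region defined later in this article). It samples random points in the bounding box until one satisfies every constraint:

```python
import random

# Pentagonal feasible region used later in this post:
# 0 <= x,y <= 10, 3x - 2y <= 10, 5x + 10y <= 56, 4x + 2y >= 7
def is_feasible(x, y):
    return (0 <= x <= 10 and 0 <= y <= 10
            and 3*x - 2*y <= 10
            and 5*x + 10*y <= 56
            and 4*x + 2*y >= 7)

def shotgun_guess(max_tries=1000, seed=1):
    """Sample uniformly in the bounding box until a feasible point appears."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        x, y = rng.uniform(0, 10), rng.uniform(0, 10)
        if is_feasible(x, y):
            return (x, y)
    return None   # no feasible point found within max_tries
```

Because the feasible region occupies a sizable fraction of the bounding box, a few random draws almost always suffice; for thin or high-dimensional regions, however, the shotgun approach can fail, which is why a deterministic routine such as NLPFEA is preferable.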

The NLPFEA subroutine in SAS/IML software

The NLPFEA routine returns a point in the feasible region from an arbitrary starting guess. Suppose the problem has p (possibly bounded) parameters and the feasible region is formed by k > 0 additional linear constraints. Then you can represent the feasible region by a (k+2) x (p+2) matrix, which is the representation used for linear programming and constrained nonlinear optimization. The first row specifies the lower bounds for the parameters and the second row specifies the upper bounds. (Missing values in the first p columns indicate that a parameter is not bounded.) The remaining k rows specify the linear constraints. For example, the following matrix defines a pentagonal region for two parameters. You can call the NLPFEA subroutine to generate a point that satisfies all constraints:

proc iml;
con = {  0   0   .   .,    /* param min */
        10  10   .   .,    /* param max */
         3  -2  -1  10,    /* 3*x1 + -2*x2 LE 10 */
         5  10  -1  56,    /* 5*x1 + 10*x2 LE 56 */
         4   2   1   7 };  /* 4*x1 +  2*x2 GE  7 */
 
guess = {0 0};                /* arbitrary p-dimensional point */
call nlpfea(z, guess, con);  /* z is a feasible point */
print guess[c={"x" "y"}], z[c={"Tx" "Ty"}];

The output shows that the guess (x,y) = (0,0) was not feasible, but the NLPFEA routine generated the transformed point T(x,y) = (1.2, 1.1), which is feasible.
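You can verify the arithmetic by hand. The following standalone Python sketch (not part of the SAS session; the `check_constraints` helper is illustrative) plugs both points into the constraints and shows that (0,0) violates the GE constraint while (1.2, 1.1) satisfies all of them, landing exactly on the boundary 4*x1 + 2*x2 = 7:

```python
def check_constraints(x1, x2):
    """Return a dict mapping each constraint to whether it is satisfied."""
    return {
        "bounds":           0 <= x1 <= 10 and 0 <= x2 <= 10,
        "3x1 - 2x2 <= 10":  3*x1 - 2*x2 <= 10,
        "5x1 + 10x2 <= 56": 5*x1 + 10*x2 <= 56,
        "4x1 + 2x2 >= 7":   4*x1 + 2*x2 >= 7,
    }

print(check_constraints(0, 0))      # the GE constraint fails
print(check_constraints(1.2, 1.1))  # all constraints hold
```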

It is interesting to visualize the NLPFEA subroutine. The following SAS/IML statements create 36 initial guesses that are distributed uniformly on a circle around the feasible region. For each guess, the program transforms the guess into the feasible region by calling the NLPFEA subroutine. The initial and transformed points are saved to a SAS data set and visualized by using PROC SGPLOT:

NumPts = 36;
twopi = 2*constant('pi');
x=.; y=.; Tx=.; Ty=.;
create feasible var {"x" "y" "Tx" "Ty"};
do i = 1 to NumPts;
   x = 2.5 + 5*cos((i-1)/NumPts * twopi);  /* guess on circle */
   y = 2.5 + 5*sin((i-1)/NumPts * twopi);
   call nlpfea(feasPt, x||y, con);         /* transform into feasible */
   Tx = feasPt[1]; Ty = feasPt[2];         
   append;
end;
close;
Transformation of points into a feasible region

The graph visualizes the result: each point on the circle is transformed into the feasible region. Some points are transformed into the interior, but many land on the boundary. In every case, the transformed point satisfies the linear constraints.

SAS/IML automatically finds feasible points

Before I finish, I want to point out that the nonlinear programming (NLP) subroutines in SAS/IML software rarely require you to call the NLPFEA subroutine explicitly. When you call an NLP routine for a linearly constrained optimization and provide a nonfeasible initial guess, the NLP routine internally calls the NLPFEA routine. Consequently, you might see the following NOTE displayed in the SAS log: NOTE: Initial point was changed to be feasible for boundary and linear constraints. For example, run the following program, which provides a nonfeasible initial guess to the NLPNRA (Newton-Raphson) subroutine.

start func(x);
   x1 = x[,1];   x2 = x[,2];
   return ( -(x1-3)##4 - (x2-2)##2 + 0.1*sin(x1#x2));
finish;
 
opt = {1,    /* find maximum of function     */
       3};   /* print a little bit of output */
x0 = {0 0};
call nlpnra(rc, x_Opt, "func", x0, opt) blc=con;

In this program, the NLPNRA subroutine detects that the guess is not feasible. It internally calls NLPFEA to obtain a feasible guess and then computes an optimal solution. This is very convenient for programmers. The only drawback is that you don't know which feasible point served as the initial guess for the optimization. However, you can call the NLPFEA subroutine directly if you want to obtain that information.

Summary

In optimization problems that have linear and boundary constraints, most optimization routines require an initial guess that is in the feasible region. The NLPFEA subroutine enables you to obtain a feasible point from an arbitrary initial guess. You can then use that feasible point as an initial guess in a built-in or user-defined optimization routine. However, for the built-in NLP subroutines, you can actually skip the NLPFEA call because the NLP subroutines internally call NLPFEA when you supply a nonfeasible initial guess.

The post How to find a feasible point for a constrained optimization in SAS appeared first on The DO Loop.

June 17, 2017
 

It has been almost a year since then-U.S. CIO Tony Scott introduced the federal open source policy that called for agencies to share federally-developed software source code. The policy, more than anything, aimed to make agencies more agile. Instead of redeveloping the same programs, the open source policy would allow [...]

How analytics and open source can improve government agility was published on SAS Voices by Trent Smith

June 16, 2017
 

Using parameters within the macro facility

Have you ever written a macro and wondered if there was an easy way to pass values to the macro? You can by using macro parameters. Macro parameters enable you to pass values into the macro at invocation and set default values for macro variables within the macro definition. In this blog post, I also discuss how you can pass in a varying number of parameter values.

There are two types of macro parameters: positional and keyword.

Positional Parameters

You can use positional parameters to assign values based on their position in the macro definition and at invocation. The order that you use to specify the values must match the order in which they are listed in the %MACRO statement. When specifying multiple positional parameters, use a comma to separate the parameters. If you do not pass a value to the macro when it is invoked, a null value is assigned to the macro variable specified in the %MACRO statement.

Here is an example:

%macro test(var1,var2,var3);                                                                                                            
 %put &=var1;                                                                                                                           
 %put &=var2;                                                                                                                           
 %put &=var3;                                                                                                                           
%mend test;                                                                                                                             
 
/** Each value corresponds to the position of each variable in the definition. **/ 
/** Here, I am passing numeric values.                                         **/                                                            
%test(1,2,3)                                                                                                                            
/** The first position matches with var1 and is given a null value.            **/                                                             
%test(,2,3)                                                                                                                             
/** I pass no values, so var1-var3 are created with null values.               **/                                                             
%test()                                                                                                                                 
/** The first value contains a comma, so I use %STR to mask the comma.         **/                                                             
/** Otherwise, I would receive an error similar to this: ERROR: More           **/
/** positional parameters found than defined.                                  **/                                                             
%test(%str(1,1.1),2,3)                                                                                                                  
/** Each value corresponds to the position of each variable in the definition. **/ 
/** Here, I am passing character values.                                       **/                                                            
%test(a,b,c) 
/** I gave the first (var1) and second (var2) positions a value of             **/
/** b and c, so var3 is left with a null value.                                **/                                                             
%test(b,c)

 

Here are the log results:

173  /** Each value corresponds to the position of each variable in the definition. **/
174  /** Here, I am passing numeric values.                                         **/
175  %test(1,2,3)
VAR1=1
VAR2=2
VAR3=3
176  /** The first position matches with var1 and is given a null value.            **/                                                             
177  %test(,2,3)
VAR1=
VAR2=2
VAR3=3
 
178  /** I pass no values, so var1-var3 are created with null values.               **/
179  %test()
VAR1=
VAR2=
VAR3=
180  /** The first value contains a comma, so I use %STR to mask the comma.         **/                                                             
181  /** Otherwise, I would receive an error similar to this: ERROR: More           **/
182  /** positional parameters found than defined.                                  **/                                                             
183  %test(%str(1,1.1),2,3)
VAR1=1,1.1
VAR2=2
VAR3=3
184  /** Each value corresponds to the position of each variable in the definition. **/
185  /** Here, I am passing character values.                                       **/
186  %test(a,b,c)
VAR1=a
VAR2=b
VAR3=c
187  /** I gave the first (var1) and second (var2) positions a value of             **/
188  /** b and c, so var3 is left with a null value.                               **/
189  %test(b,c)
VAR1=b
VAR2=c
VAR3=

 

Keyword Parameters

The benefit of using keyword parameters is the ability to give the macro variables a default value within the macro definition. When you assign values using keyword parameters, you must include an equal sign after the macro variable name.

Here is an example:

%macro test(color=blue,id=123);                                                                                                         
 %put &=color;                                                                                                                          
 %put &=id;                                                                                                                             
%mend test;                                                                                                                             
 
/** Values passed to the macro overwrite default values from the definition. **/                                                                 
%test(color=red,id=456)                                                                                                                 
/** Passing in no values allows the default values to take precedence.      **/                                                                 
%test()                                                                                                                                 
/** You are not required to pass in a value for each keyword parameter.    **/                                                                 
%test(color=green)                                                                                                                      
/** The order of variables does not matter.                               **/                                                                                                 
%test(id=789,color=yellow)

 

Here are the log results:

270  /** Values passed to the macro overwrite default values from the definition. **/
271  %test(color=red,id=456)
COLOR=red
ID=456
272  /** Passing in no values allows the default values to take precedence.     **/
273  %test()
COLOR=blue
ID=123
274  /** You are not required to pass in a value for each keyword parameter.   **/
275  %test(color=green)
COLOR=green
ID=123
276  /** The order of variables does not matter.                              **/
277  %test(id=789,color=yellow)
COLOR=yellow
ID=789

 

If the macro definition combines positional and keyword parameters, positional parameters must come first. If you do not follow this order, this error is generated:

ERROR: All positional parameters must precede keyword parameters.

 

Here is an example:

%macro test(val,color=blue,id=123);                                                                                                     
 %put &=color;                                                                                                                          
 %put &=id;                                                                                                                             
 %put &=val;                                                                                                                            
%mend test;                                                                                                                             
 
/** The positional parameter is listed first. **/                                                                 
%test(1,color=red,id=456)
 
Here are the log results:
 
318  /** The positional parameter is listed first. **/
319  %test(1,color=red,id=456)
COLOR=red
ID=456
VAL=1

 

PARMBUFF

The PARMBUFF option creates a macro variable called &SYSPBUFF that contains the entire list of parameter values, including the parentheses. This enables you to pass a varying number of parameter values to the macro. The following example illustrates how to parse each word in the parameter list:

%macro makes/parmbuff; 
  /** The COUNTW function counts the number of words within &SYSPBUFF.            **/                                                                                                                 
   %let cnt=%sysfunc(countw(&syspbuff)); 
  /** The %DO loop increments based on the number of words returned to the    **/
  /** macro variable &CNT.                                                    **/
   %do i= 1 %to &cnt;  
  /** The %SCAN function extracts each word from &SYSPBUFF.                      **/                                                                                                                  
     %let make=%scan(&syspbuff,&i);                                                                                                     
     %put &make;                                                                                                                        
   %end;                                                                                                                                
%mend makes;                                                                                                                            
 
%makes(toyota,ford,chevy)

 

Here are the log results:

19  %macro makes/parmbuff;
20    /** The COUNTW function counts the number of words within &SYSPBUFF.            **/
21     %let cnt=%sysfunc(countw(&syspbuff));
22    /** The %DO loop increments based on the number of words returned to the macro  **/
23    /** variable &CNT.                                                              **/
24     %do i= 1 %to &cnt;
25    /** The %SCAN function extracts each word from &SYSPBUFF.                       **/
26       %let make=%scan(&syspbuff,&i);
27       %put &make;
28     %end;
29  %mend makes;
30
31  %makes(toyota,ford,chevy)
toyota
ford
chevy

 

When you specify the PARMBUFF option and the macro definition includes both positional and keyword parameters, the parameters still receive values when you invoke the macro. In this scenario, the entire invocation list of values is assigned to &SYSPBUFF. Here is an example:

%macro test(b,a=300)/parmbuff;                                                                                                      
 %put &=syspbuff;                                                                                                                        
 %put _local_;                                                                                                                          
%mend;                                                                                                                                  
 
%test(200,a=100)

 

Here are the log results:

SYSPBUFF=(200,a=100)
TEST A 100
TEST B 200

 

Notice that &SYSPBUFF includes the entire parameter list (including the parentheses), but each individual parameter still receives its own value.

If you need to know all the parameter values that are passed to the macro, specify the PARMBUFF option in the macro definition to get access to &SYSPBUFF, which contains all the parameter values. For more information about PARMBUFF, see %MACRO Statement in SAS® 9.4 Macro Language: Reference, Fifth Edition.

I hope this blog post has helped you understand how to pass values to a macro. If you have SAS macro questions that you would like me to cover in future blog posts, please comment below.

Using parameters within the macro facility was published on SAS Users.

June 16, 2017
 

Tiffany Carpenter, head of customer intelligence at SAS UK & Ireland, looks at the benefits of real-time customer experience and offers a preview into how analytics is powering hyper-personalised customer journeys

In recent years, customer experience has become an important battleground for brands. Yet, in a hyper-connected, hyper-competitive environment where it is becoming increasingly difficult to compete on product or price alone, the concept of customer experience has grown in importance as organisations fight to remain relevant and deliver against customer expectations.

Customers expect the organisations they interact with to make it easy to do business with them. They expect a seamless experience regardless of how they engage with you, whether online, via an app, through a call centre or in person; and they expect the personal information and data they have made available to be used appropriately to deliver relevant experiences. To meet these expectations, businesses must first fully understand the wants and needs of current and prospective customers. While this may sound simple in principle, most organisations use only a limited amount of data to understand their customers. In fact, most UK organisations admit to using less than half of the valuable data available to them, and they often analyse it using basic tools or spreadsheets that fail to provide a single view of the customer.

Achieving a segment of one

What’s needed is an approach that allows organisations to concentrate on delivering a superior customer experience by achieving relevancy at every touchpoint based on an understanding of each individual customer – a segment of one.

Today’s customers want the call centre to know when they have just been on the website. They want brands to adjust their marketing strategies if they have made a complaint or negatively reviewed a product or service. For businesses, this means having access to a ‘central brain’ that can analyse all the available data in a timely manner, with the ability to inject that insight into any customer interaction across any department and channel – in real time if necessary.

This means using data about what’s already happened, as well as what’s happening now, to predict what’s going to happen in the future and what the best outcomes will be, and to make profitable and accurate decisions at each point of a customer interaction.

The central brain

In the race to digitalisation, the mistake many businesses make when trying to achieve a segment of one is placing too much emphasis and too narrow a focus on digital data. Each lifecycle stage, across each channel, is important – from initial consideration, to active evaluation, to the moment of purchase and even the post-purchase experience. Key to successful customer intelligence strategies is tying together offline and online data to get a better understanding of the customer.

Rather than analysing data from a single digital transaction or following customers around in a digital world, it’s more important to understand what happens before, during and after a digital interaction to create a full picture of behavioural insights. To truly understand customer behaviour and deliver the most value at each customer touchpoint, non-digital data – demographic, psychographic, transactional, risk and many other types that sit both inside and outside the digital environment – needs to be analysed and mapped to specific stages in the customer lifecycle.

More importantly, once businesses gain these insights, they need to consider how they use this insight to make the right decisions that deliver value to the business. Where appropriate those decisions need to be made in real time and injected into the customer interaction channel at the point of engagement. Each stage of the customer journey needs to be viewed as an opportunity to improve the customer experience. And each stage is an opportunity to gain more insight that can be fed back into marketing processes to draw from the next time. Only then can you deliver the right message at the right time via the right channel.

A personalised experience in real-time

Shop Direct is a great example of a business embracing this approach. Its goal was to make it easier for customers to shop with it, thereby improving the customer experience whilst increasing customer spend. As a 40-year-old business that started as a catalogue company, it was sitting on a huge amount of data that had been captured over the years about its customers, and it wanted to find a way to use that data to deliver a highly personalised customer experience.

At the time, a customer shopping for jeans on its Very.co.uk website could be presented with 50 pages of options to scroll through. By analysing the existing data, Shop Direct is now able to predict which jeans a customer is most likely to be interested in and personalise the customer’s shopping experience. This is done via an individually personalised sort order, generated in real time, that shows the products they are most interested in first. Harnessing data and advanced analytics to deliver unparalleled levels of personalisation has seen Shop Direct’s profits surge by 43%.

Group CEO at Shop Direct, Alex Baldock, has said that the company is "all about making it easier for our customers to shop. That's why we're passionate about personalisation. We want to tailor everything for our customer; the shop they visit and how we engage with them - before, during and after they’ve shopped."

The survival factor

In the future, developing a superior customer experience will rely on understanding the balance between delivering the right decision in real-time and giving yourself time to make the right decision. It’s crucial to remember that not every decision about the customer experience needs to be managed in real-time. Organisations have huge amounts of data at their fingertips that they can use to predict and plan to shape products, services and messages.

However, there will be moments when a decision needs to be made in real time as to what the right content, message, offer or recommendation for an individual customer might be. This decision should not be based solely on what area of a website a customer clicked on, or whether they liked your Facebook page. Making accurate and profitable decisions requires insight into offline and online historical data. This must be coupled with real-time contextual data, a clear understanding of business goals and objectives, and clarity about the predicted outcome of each possible decision. To achieve this, businesses must move away from a channel-specific approach with fragmented systems and rules and embrace a centralised analytical decisioning capability – one with access to all relevant data and a centralised set of logic and rules, able to automate complex analytical decisions at scale and push them out to any channel across any business unit at the right time.

This will need to be what underpins the entire business; the organisations that get this right will be the ones that survive.

For more insights into how analytics is powering today’s hyper-personalised customer journey, come along to the SAS Data and Customer Experience Forum where we will be announcing headline findings from new research exploring where UK businesses are on the journey to delivering a real-time customer experience.

Transforming the customer experience with analytics was published on Customer Intelligence Blog.

6月 162017
 

Tiffany Carpenter, head of customer intelligence at SAS UK & Ireland, looks at the benefits of real-time customer experience and offers a preview into how analytics is powering hyper-personalised customer journeys

In recent years, customer experience has become an important battleground for brands. Yet, in a hyper-connected, hyper-competitive environment where it is becoming increasingly difficult to compete on product or price alone, the concept of customer experience has grown in importance as organisations fight to remain relevant and deliver against customer expectations.

Customers expect the organisations they are interacting with to make it easy to business with them. They expect a seamless experience regardless of how they engage with you whether it be online, via an app, a call centre or in person; and they expect their personal information and data that they have made available, to be used appropriately by organisations to deliver relevant experiences.  To deliver against these expectations,  businesses must first fully understand the wants and needs of current and prospective customers. While this may sound simple enough in principle, most organisations are only using a limited amount of data to try to understand their customers. In fact, most UK organisations admit to using less than half of the valuable data available to them, and they will often analyse it using basic tools or spreadsheets that fail to provide a single view of the customer.

Achieving a segment of one

What’s needed is an approach that allows organisations to concentrate on delivering a superior customer experience by achieving relevancy at every touchpoint based on an understanding of each individual customer – a segment of one.

Today’s customers want the call centre to know when they have just been on the website. They want brands to adjust their marketing strategies if they’ve  made a complaint or negatively reviewed a product or service For businesses, this means having access to a ‘central brain’ that can analyse of all the data available in a timely manner with the ability to inject that insight into any customer interaction across any department and channel -  in real-time if necessary.

This means using data about what’s already happened as well as what’s happening now, to predict what’s going to happen in the future, what the best outcomes will be and make profitable and accurate decisions at each point of a customer interaction.

The central brain

In the race to digitalisation, the mistake many businesses make when trying to achieve a segment of one is placing too much emphasis and narrow focus on digital data. Each lifecycle stage, across each channel is important – from initial consideration, to active evaluation, to the moment of purchase and even the post-purchase experience. Key to successful customer intelligence strategies is tying together offline and online data to get a better understanding of the customer.

Rather than analysing data from a single digital transaction or following customers around in a digital world, It’s more important to understand what happens prior, during and after a digital interaction to create a full picture of behavioural insights. To truly understand customer behaviour and deliver the most value at each customer touch point non-digital data such as demographic, psychographic, transactional, risk and many others types of data - that sit both outside and inside the digital environment - needs to be analysed and mapped to specific stages in the customer lifecycle.

More importantly, once businesses gain these insights, they need to consider how they use this insight to make the right decisions that deliver value to the business. Where appropriate those decisions need to be made in real time and injected into the customer interaction channel at the point of engagement. Each stage of the customer journey needs to be viewed as an opportunity to improve the customer experience. And each stage is an opportunity to gain more insight that can be fed back into marketing processes to draw from the next time. Only then can you deliver the right message at the right time via the right channel.

A personalised experience in real-time

Shop Direct is a great example of a business embracing this approach. Its goal was to make it easier for customers to shop with them, thereby improving the customer experience whilst increasing customer spend. As a 40-year-old business that started as a catalogue company, it was sitting on a huge amount of data that had been captured over the years about its customers and they wanted to find a way to use that data to deliver a highly personalised customer experience.

At the time, a customer shopping for jeans on their Very.co.uk website could be presented with 50 pages of options to scroll through. By analysing the existing data Shop Direct is now able to predict which jeans a customer is most likely to be interested in and personalise the customer’s shopping experience. This is done via an individually personalised sort order in real time to show the products they are most interested in first. Harnessing data and advanced analytics to deliver unparalleled levels of personalistion has seen Shop Direct’s profits surge by 43%.

Group CEO at Shop Direct, Alex Baldock, has said that the company is "all about making it easier for our customers to shop. That's why we're passionate about personalisation. We want to tailor everything for our customer; the shop they visit and how we engage with them - before, during and after they’ve shopped."

The survival factor

In the future, developing a superior customer experience will rely on understanding the balance between delivering the right decision in real-time and giving yourself time to make the right decision. It’s crucial to remember that not every decision about the customer experience needs to be managed in real-time. Organisations have huge amounts of data at their fingertips that they can use to predict and plan to shape products, services and messages.

However, there will be moments when a decision needs to be made in real time as to what the right content, message, offer or recommendation for an individual customer might be. This decision should not be based solely on which area of a website a customer clicked on, or whether they liked your Facebook page. Making accurate and profitable decisions requires insight into offline and online historical data, coupled with real-time contextual data, a clear understanding of business goals and objectives, and clarity around the predicted outcome of each possible decision. To achieve this, businesses must move away from a channel-specific approach with fragmented systems and rules and embrace a centralised analytical decisioning capability: one with access to all relevant data and a centralised set of logic and rules, able to automate complex analytical decisions at scale and push them out to any channel across any business unit at the right time.

This capability must underpin the entire business; the organisations that get this right will be the ones that survive.

For more insights into how analytics is powering today’s hyper-personalised customer journey, come along to the SAS Data and Customer Experience Forum where we will be announcing headline findings from new research exploring where UK businesses are on the journey to delivering a real-time customer experience.

Transforming the customer experience with analytics was published on Customer Intelligence Blog.

June 16, 2017
 
Comparing Keras models: VGG16, VGG19, ResNet, Inception, and Xception
# -*- coding: utf-8 -*-
# @DATE    : 2017/6/14 14:15
# @Author  : 
# @File    : classify_image.py

import os

from keras.applications import ResNet50
from keras.applications import InceptionV3
from keras.applications import Xception # TensorFlow ONLY
from keras.applications import VGG16
from keras.applications import VGG19
from keras.applications import imagenet_utils
from keras.applications.inception_v3 import preprocess_input
from keras.preprocessing.image import img_to_array
from keras.preprocessing.image import load_img
import numpy as np


MODEL = {
    "vgg16": VGG16,
    "vgg19": VGG19,
    "inception": InceptionV3,
    "xception": Xception,
    "resnet": ResNet50
}

def load_preprocess_image(model_name, image_path):
    # vgg resnet
    input_shape = (224, 224)
    preprocess_func = imagenet_utils.preprocess_input

    if model_name in ["inception", "xception"]:
        input_shape = (299, 299)
        preprocess_func = preprocess_input

    # load image
    image = load_img(image_path, target_size=input_shape)

    # preprocess image
    image = img_to_array(image)
    image = np.expand_dims(image, axis=0)
    image = preprocess_func(image)

    return image


def image_model(model_name):
    network = MODEL[model_name]
    network = network(weights="imagenet")
    return network

def classify_image(model, image):
    preds = model.predict(image)
    preds = imagenet_utils.decode_predictions(preds)
    return preds


if __name__ == "__main__":
    data_dir = "data"
    image_names = ["ball.jpg"]
    image_paths = [ os.path.join(data_dir, image_name)  for image_name in image_names]
    model_names = ["vgg16", "vgg19", "resnet", "inception", "xception"]
    for image_path in image_paths:
        print("Image: {}".format(os.path.split(image_path)[-1]))
        for model_name in model_names:
            image = load_preprocess_image(model_name, image_path)
            model = image_model(model_name)
            preds = classify_image(model, image)
            print("Model: {}".format(model_name))
            print("Results(Top 5): ")
            for (i, (imagenetID, label, prob)) in enumerate(preds[0]):
                print("{}. {}: {:.2f}%".format(i + 1, label, prob * 100))
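One possible refinement, left as a sketch: the `__main__` loop above rebuilds every network again for each image, repeating the expensive weight loading. A small memoizing loader avoids that (here `fake_loader` is a hypothetical stand-in for `lambda: MODEL[model_name](weights="imagenet")`, used only to illustrate the caching behaviour):

```python
# Sketch: build each network at most once and reuse it across images.
# 'loader' stands in for e.g. lambda: MODEL[model_name](weights="imagenet").
_model_cache = {}

def get_model(model_name, loader):
    if model_name not in _model_cache:
        _model_cache[model_name] = loader()
    return _model_cache[model_name]

# Hypothetical stand-in loader, counting how many times it is invoked.
calls = []
def fake_loader():
    calls.append(1)
    return "network"

net1 = get_model("vgg16", fake_loader)
net2 = get_model("vgg16", fake_loader)
print(len(calls))  # the loader ran only once despite two lookups
```

In the script, `image_model` could then call `get_model(model_name, lambda: MODEL[model_name](weights="imagenet"))`, so each model is constructed only once no matter how many images are classified.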


Run log:



Using TensorFlow backend.
Image: ball.jpg
2017-06-16 19:50:03.635390: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
2017-06-16 19:50:03.635405: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2017-06-16 19:50:03.635408: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
2017-06-16 19:50:03.635413: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
Model: vgg16
Results(Top 5): 
1. soccer_ball: 93.73%
2. rugby_ball: 5.88%
3. volleyball: 0.13%
4. golf_ball: 0.11%
5. tennis_ball: 0.04%
Model: vgg19
Results(Top 5): 
1. soccer_ball: 98.07%
2. rugby_ball: 1.24%
3. golf_ball: 0.32%
4. volleyball: 0.20%
5. croquet_ball: 0.07%
Model: resnet
Results(Top 5): 
1. soccer_ball: 99.54%
2. rugby_ball: 0.39%
3. volleyball: 0.06%
4. running_shoe: 0.00%
5. football_helmet: 0.00%
Model: inception
Results(Top 5): 
1. soccer_ball: 99.88%
2. volleyball: 0.11%
3. rugby_ball: 0.00%
4. sea_urchin: 0.00%
5. silky_terrier: 0.00%
Model: xception
Results(Top 5): 
1. soccer_ball: 90.85%
2. volleyball: 2.48%
3. rugby_ball: 1.37%
4. balloon: 0.11%
5. airship: 0.10%
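For reference, `imagenet_utils.decode_predictions` returns one list per input image, each entry a `(imagenetID, label, probability)` tuple; the printing loop in the script formats those into the lines shown above. A standalone sketch of that formatting, using made-up sample tuples:

```python
def format_top_preds(preds_for_image):
    """Format (imagenet_id, label, prob) tuples the way the script's
    print loop does, e.g. '1. soccer_ball: 93.73%'."""
    return ["{}. {}: {:.2f}%".format(i + 1, label, prob * 100)
            for i, (_id, label, prob) in enumerate(preds_for_image)]

# Hypothetical sample mimicking decode_predictions output for one image.
sample = [("n04254680", "soccer_ball", 0.9373),
          ("n04118538", "rugby_ball", 0.0588)]
print(format_top_preds(sample))  # ['1. soccer_ball: 93.73%', '2. rugby_ball: 5.88%']
```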

Reference: http://www.pyimagesearch.com/2017/03/20/imagenet-vggnet-resnet-inception-xception-keras/

Posted by at 7:47 PM
June 16, 2017
 
Chinese-Character-Recognition https://github.com/soloice/Chinese-Character-Recognition
ImageNet: VGGNet, ResNet, Inception, and Xception with Keras http://www.pyimagesearch.com/2017/03/20/imagenet-vggnet-resnet-inception-xception-keras/
TensorFlow large-scale input handling; see the CIFAR-10 input code https://github.com/tensorflow/models/blob/master/tutorials/image/cifar10/cifar10_input.py
A Survey on Deep Learning in Medical Image Analysis https://arxiv.org/pdf/1702.05747.pdf
http://www.jeyzhang.com/understanding-lstm-network.html
Awesome Deep learning papers and other resources https://github.com/endymecy/awesome-deeplearning-resources
Multi-Scale Convolutional Neural Networks for Time Series Classification https://arxiv.org/pdf/1603.06995v4.pdf
LSTM time series prediction https://github.com/RobRomijnders/LSTM_tsc
CNN time series prediction http://robromijnders.github.io/CNN_tsc/

 
Posted by at 7:45 PM
June 15, 2017
 

Regulations, corporate drivers, leadership and market influences have combined to produce a patchwork of uneven progress on initiatives such as distributed generation, customer choice, asset optimization and the industrial Internet of Things. These initiatives all rely on analytics to gain the most return on investment. To better understand organizational readiness [...]

Utility analytics in 2017: Aligning data and analytics with business strategy was published on SAS Voices by Mike F. Smith