January 22, 2010
 
You have probably heard the news by now. SAS is ranked as the #1 best place to work by Fortune Magazine. You can read more about why SAS is ranked #1.


But here's my story
I have worked at SAS for a long time. Some of my best friends work here. We met in the gym or the break room or a conference room. Some I even met at local stores and restaurants. Over the years, we have worked together, played together, grieved together, and celebrated together. The people are great. If you ask a SAS employee to tell you why they like working here, most say healthcare, daycare, and the people. I'll agree with all of those things. But let me tell you about yesterday and you'll understand why I really like my job at SAS.

We're ready to make some updates to support.sas.com/community. We want these updates to be innovative, useful, and fun. So I scheduled a brainstorming session over lunch (yes, over lunch) and invited colleagues from around the company to come and participate. Ok, here's the good part: 17 people gave up their lunch hour to discuss ideas that can enhance how our customers work with our Web site, with us as a company, and with each other. It was exciting.

And that is why I think SAS is a great place to work -- the number of work days that offer something to be excited about outnumber the days that don't.

I hope that you find something at work to be excited about this week. It sure makes working more fun.
January 16, 2010
 
I hate shopping. Going to the local mall is a form of torture for me. But send me to a virtual store, and I’ll happily browse online and likely place an order. Now I can do the same thing, of sorts, with SAS Global Forum since the presentation schedules and abstracts are posted online. I can just “shop” for what interests me most.

With over 375 presentations to choose from, planning ahead is the smart way for me to manage my time at SGF. Want to go shopping too? Just use the Personal Agenda Builder - it's your starting point for exploring all that SAS Global Forum 2010 has to offer. It has the latest information on the papers, posters and workshops, and the listings, including abstracts, can be viewed by Date, Company, Industry, Paper Type, Section, Skill Level or Speaker. An ad-hoc search works too. Happy “shopping,” searching and planning!

The Tech Blog Reopens

January 4, 2010
 

I have imported all the writing from my old Space blog into this new one.

In this shiny new year, I want to write more about SAS programmers themselves: the profession, and the industry environment it depends on. SAS programming is not yet a thriving career in China.

There will also be writing about SAS itself: the SAS language, the SAS company, and its founders. Lately I have been especially interested in SAS co-founder Tony Barr.

As for technology itself, which is tied to my livelihood, much of the ink will probably stay on CDISC in addition to SAS. Of course there will be other writing as the mood strikes; it is only the beginning of the year and nothing is settled. As agreed with the Capital of Statistics (统计之都), I will publish all statistics-related writing there first and keep a backup on my own blog:

http://cos.name/author/hujiangtang/

January 2, 2010
 
********************************************************;
* A SAS MACRO IMPORTING SQLITE DATA TABLE WITHOUT ODBC *;
* ---------------------------------------------------- *;
* REQUIREMENT:                                         *;
*   HAVE SQLITE3.EXE DIRECTORY IN THE SYSTEM PATH.     *;
* LIMITATIONS:                                         *;
*   1. UNABLE TO HANDLE THE TABLE WITH BLOB DATA TYPE  *;
*   2. THE DEFAULT LENGTH OF ALL CHAR DATA IS 200      *;
*   3. SYSTEM DEPENDENT (CHANGES NEEDED FOR OTHER OS)  *;
*   4. POTENTIALLY SLOW FOR LARGE TABLES               *;
********************************************************;

%let chlen = 200;

options mlogic symbolgen mprint;

%macro sqlite(path = , dbfile = , table = );

%let fileref = %trim(&path)\%trim(&dbfile);

filename fileref "&fileref";

%if %sysfunc(fexist(fileref)) %then %do;

  %let dumpref = %trim(&path)\dump.txt;

  filename dumpref "&dumpref";

  %if %sysfunc(fexist(dumpref)) %then %do;
    data _null_;
      rc = fdelete("dumpref");
    run;
  %end;

  %let batref = %trim(&path)\dump.bat;

  filename batref "&batref";
  

  %if %sysfunc(fexist(batref)) %then %do;
    data _null_;
      rc = fdelete("batref");
    run;
  %end;

  data _null_;
    file batref;
    cmd = "sqlite3 &fileref "||'"'||".dump &table"||'"'||" > &dumpref";
    put cmd;
  run;
  
  systask command "&batref" wait;  
      
%end;
%else %goto exit;

data create;
  infile dumpref;
  input;
  create = upcase(_infile_);
  if _n_ > 1 then output;
  if substr(create, 1, 2) = ');' then stop;
run;

proc sql noprint;
select
  tranwrd(create, "TEXT", "CHAR(&chlen.)")
into
  :create separated by ' '
from
  create;

&create;
quit;    

proc datasets library = work nolist;
  delete create(mt = data);
run;
quit;

data _null_;
  infile dumpref end = eof;
  input;
  if _n_ = 1 then do;
    call execute("PROC SQL;");
  end;
  if index(upcase(_infile_), 'INSERT INTO') > 0 then do;
    pos1 = index(_infile_, '(');
    pos2 = index(_infile_, ');');    
    insert = substr(_infile_, pos1, pos2 - pos1 + 2);
    call execute("INSERT INTO &TABLE VALUES"||insert);
  end;
  if eof then do;
    call execute("QUIT;");
  end;
run;

data _null_;
  rc = fdelete("batref");
run;

data _null_;
  rc = fdelete("dumpref");
run;

proc contents data = &table varnum;
run;

%exit:
    
%mend sqlite;    

%sqlite(path = D:\sas, dbfile = test.db, table = tblco);
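For readers outside SAS, the same "no ODBC" idea can be sketched in Python; this is my own addition, not part of the original macro. The standard sqlite3 module reads a table directly, so no CLI dump file is needed. The read_table helper and the test.db/tblco names below are illustrative.

```python
import os
import sqlite3
import tempfile

def read_table(dbfile, table):
    """Return (column_names, rows) for one SQLite table, no ODBC needed.

    The table name is interpolated into the SQL, so it must come from
    trusted code rather than user input.
    """
    con = sqlite3.connect(dbfile)
    try:
        cur = con.execute("SELECT * FROM %s" % table)
        names = [d[0] for d in cur.description]
        return names, cur.fetchall()
    finally:
        con.close()

# Demo: build a tiny stand-in for test.db/tblco, then read it back
path = os.path.join(tempfile.mkdtemp(), "test.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE tblco (plant TEXT, conc REAL)")
con.executemany("INSERT INTO tblco VALUES (?, ?)",
                [("Qn1", 95.0), ("Qn1", 175.0)])
con.commit()
con.close()

names, rows = read_table(path, "tblco")
print(names)  # ['plant', 'conc']
print(rows)   # [('Qn1', 95.0), ('Qn1', 175.0)]
```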
January 1, 2010
 
################################################
# EXPORT DATA FRAME INTO SQLITE DATABASE       #
# USING SQLDF PACKAGE(code.google.com/p/sqldf) #
################################################

library(sqldf)
library(datasets)
data(CO2)

### SET WORKING DIRECTORY ###
setwd('d:\\r')

### TEST IF THE FILE EXISTS ###
if(file.exists('test.db')) file.remove('test.db')

### CREATE A NEW EMPTY SQLITE DATABASE ###
sqldf("attach 'test.db' as new")

### CREATE A NEW TABLE FROM DATA FRAME ###
sqldf("create table tblco as select * from CO2", dbname = 'test.db')

### SHOW THE DATA IN SQLITE DATABASE ###
sqldf("select * from tblco limit 5", dbname = 'test.db')
#  Plant   Type  Treatment conc uptake
#1   Qn1 Quebec nonchilled   95   16.0
#2   Qn1 Quebec nonchilled  175   30.4
#3   Qn1 Quebec nonchilled  250   34.8
#4   Qn1 Quebec nonchilled  350   37.2
#5   Qn1 Quebec nonchilled  500   35.3
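The equivalent export can also be done from plain Python with the standard sqlite3 module; this sketch is my addition for comparison, and the sample CO2 rows are just the first two from the output above.

```python
import os
import sqlite3
import tempfile

# two sample rows of the CO2 data, matching the sqldf output above
co2 = [("Qn1", "Quebec", "nonchilled", 95.0, 16.0),
       ("Qn1", "Quebec", "nonchilled", 175.0, 30.4)]

path = os.path.join(tempfile.mkdtemp(), "test.db")
if os.path.exists(path):  # mirror file.exists()/file.remove() above
    os.remove(path)

con = sqlite3.connect(path)  # creates the database file if absent
con.execute("CREATE TABLE tblco "
            "(Plant TEXT, Type TEXT, Treatment TEXT, conc REAL, uptake REAL)")
con.executemany("INSERT INTO tblco VALUES (?, ?, ?, ?, ?)", co2)
con.commit()

# show the data, like sqldf("select * from tblco limit 5")
rows = con.execute("SELECT * FROM tblco LIMIT 5").fetchall()
con.close()
print(rows)
```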
December 26, 2009
 
###############################
# CALCULATE SUMMARY BY GROUPS #
###############################

data(iris)

### WITH AGGREGATE() FUNCTION ###
sum1 <- aggregate(iris[-5], iris[5], mean)
sum1
#     Species Sepal.Length Sepal.Width Petal.Length Petal.Width
#1     setosa        5.006       3.428        1.462       0.246
#2 versicolor        5.936       2.770        4.260       1.326
#3  virginica        6.588       2.974        5.552       2.026

### WITH BY() FUNCTION ###
tmp2 <- by(iris[-5], iris[5], mean)
sum2 <- cbind(expand.grid(dimnames(tmp2)), do.call(rbind, tmp2))
rownames(sum2) <- NULL
sum2              
#     Species Sepal.Length Sepal.Width Petal.Length Petal.Width
#1     setosa        5.006       3.428        1.462       0.246
#2 versicolor        5.936       2.770        4.260       1.326
#3  virginica        6.588       2.974        5.552       2.026

### WITH RESHAPE PACKAGE ###
library(reshape)
tmp3 <- melt(iris)
sum3 <- cast(tmp3, Species ~ variable, mean)
sum3
#     Species Sepal.Length Sepal.Width Petal.Length Petal.Width
#1     setosa        5.006       3.428        1.462       0.246
#2 versicolor        5.936       2.770        4.260       1.326
#3  virginica        6.588       2.974        5.552       2.026

### WITH SQLDF PACKAGE ###
library(sqldf)
sum4 <- sqldf("select
                 Species,
                 avg(Sepal_Length) as Sepal_Length,
                 avg(Sepal_Width)  as Sepal_Width,
                 avg(Petal_Length) as Petal_Length,
                 avg(Petal_Width)  as Petal_Width
               from
                 iris group by Species");
sum4
#     Species Sepal_Length Sepal_Width Petal_Length Petal_Width
#1     setosa        5.006       3.428        1.462       0.246
#2 versicolor        5.936       2.770        4.260       1.326
#3  virginica        6.588       2.974        5.552       2.026

### WITH LAPPLY() FUNCTION ###
tmp5 <- lapply(split(iris[-5], iris[5]), mean)
sum5 <- data.frame(Species = names(tmp5), do.call(rbind, tmp5))
rownames(sum5) <- NULL
sum5 
#     Species Sepal.Length Sepal.Width Petal.Length Petal.Width
#1     setosa        5.006       3.428        1.462       0.246
#2 versicolor        5.936       2.770        4.260       1.326
#3  virginica        6.588       2.974        5.552       2.026
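The same group means can be computed without any data-frame library at all; here is a plain-Python sketch (my addition, not from the original post) of the split-then-average idea behind split() and lapply(), using a hypothetical miniature of the iris data:

```python
from collections import defaultdict

# Hypothetical miniature of the iris data: (species, sepal_length)
iris = [("setosa", 5.1), ("setosa", 4.9),
        ("versicolor", 7.0), ("versicolor", 6.4)]

# split by group, then average each group, mirroring split() + lapply()
groups = defaultdict(list)
for species, sepal_length in iris:
    groups[species].append(sepal_length)

means = {sp: sum(v) / len(v) for sp, v in groups.items()}
print(means)  # {'setosa': 5.0, 'versicolor': 6.7}
```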

From A Logical Point of View

December 21, 2009
 


This blog, “From A Logical Point of View,” is not, despite the name, run by a logician. The book From A Logical Point of View is a collection of logical and philosophical essays by W. V. Quine (1908-2000), an American philosopher and mathematician.

A story about the title: “From a Logical Point of View” was a calypso song performed by Robert Mitchum, an American actor, composer, and singer. Quine enjoyed the song and borrowed its title for his new book, which became his best seller. I love the book in turn, and so I gave its name to my blog, from a logical point of view.

December 21, 2009
 
If you have a magazine subscription or you read the magazines at the checkout counter, you know that they are always a month ahead of your calendar. My January 2010 magazines have already arrived and they are all talking about resolutions. Resolutions are a good thing, but shouldn't be limited to January 1. So try this year-long resolution -- challenge your comfort zone once each week. That's only 52 new things for 2010.

We all get in ruts and don't always realize it. These ruts can keep us from having innovative ideas, from meeting interesting people, or from finding a new favorite food. Stepping outside of your comfort zone can make you happier, healthier, and maybe wealthier if your innovative ideas are good enough.

I have started to work on this concept already. Here's my small, fun story. I went to a holiday lunch with a couple of friends. At the last minute, we picked the cafe at the landscaping company next door instead of the restaurant we had driven downtown for. With a little encouragement, I bought a new swag to decorate my mantel. It is beautiful and I can't wait for my family to take Christmas pictures in front of it.




My comfort zone challenge continues into the new year, personally and professionally. Over the next few weeks, I'll be finishing up a paper to be presented at SAS Global Forum 2010. This is a big step for many of us, but when you have information to share, the payoff for you and your audience can be well worth the sweaty palms and rapid heartbeat.

But, you don't have to go that far. I've provided a few simple personal and professional suggestions to help you challenge your comfort zone in 2010.

Start small. Try the restaurant next door to your favorite restaurant. Sit at a different table in your company cafe. Watch a video to learn a new skill instead of using your favorite documentation. Visit your local library and check out an author you have never heard of. Or, try one of these:

December 16, 2009
 
There is an ancient t-shirt that I have from my first users group conference. It dates from 1995, and it shows a cat holding a floppy disk and the words WUSS (for Western Users of SAS Software) on it. Never mind that I can’t fit into it anymore, nor can I bear to turn it into a rag for washing my car. It’s a great t-shirt.

Don’t we all have a favorite t-shirt or two? And wouldn’t it be really cool to design something that others love to wear and might outgrow? Well, now’s your chance. There’s a contest on sasCommunity.org that you need to check out. A few quick rules: original designs only; the t-shirt can be any representation of your use of SAS, the SAS Community in general, sasCommunity.org, or SAS Global Forum; it can be humorous or serious, just keep it clean and original. And as you might expect, you have to agree to the Legal Mumbo Jumbo to submit. We have 3 submissions already!

It’s actually pretty simple: The winner will be announced on sasCommunity.org on February 8, 2010 and a limited number of the winning t-shirts will be distributed at SAS Global Forum 2010. For the record, I’ll be wearing an XL, unless my New Year’s resolution to lose 15 pounds is successful.