
為生存而奔跑


------------------------- How to Use AWK --------------------------

Awk is a powerful command language that allows the user to manipulate files containing columns of data and strings. Awk is extremely useful, both for general operation of Unix commands and for data reduction (e.g. with IRAF). You might also learn how to use the stream editor sed. Many applications of awk resemble tasks done on PC spreadsheets.

This file contains a number of examples of how to use awk. I have compiled this table gradually over a couple of years as I've learned to do new things. Everyone who reduces data with IRAF should learn the fundamentals of awk; learning to do even simple things will save you a lot of time in the long run. It should take you less than an hour to read through this file and learn the basics.

There are two ways to run awk. A simple awk command can be run from a single command line. More complex awk scripts should be written to a command file. I present examples of both types of input below.

Awk takes each line of input and tries to match the 'pattern' (see below), and if it succeeds it will do whatever you tell it to do within the {} (called the action). Awk works best on files that have columns of numbers or strings that are separated by whitespace (tabs or spaces), though on most machines you can use the -F option if your columns are set apart by another character. Awk refers to the first column as $1, the second column as $2, etc., and the whole line as $0. If you have a file (such as a catalog) that always has numbers in specific columns, you may also want to run the command 'colrm' and combine it with awk. There is a manual page on colrm. There is also a very incomplete man page on awk.
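A quick illustration of the field references just described (the input lines here are made up for the example; NF, covered later, is the number of fields on the current line):

```shell
# $1 and $2 are the first two whitespace-separated fields; NF counts them
printf 'alpha 1 x\nbeta 2 y\n' | awk '{print $2, $1, NF}'
# prints:
# 1 alpha 3
# 2 beta 3

# If the columns are set apart by another character, use -F;
# here a colon-separated line in the style of /etc/passwd
echo 'root:x:0:0' | awk -F: '{print $1, $3}'
# prints: root 0
```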

I'll lead you through two examples. First, suppose you have a file called 'file1' that has 2 columns of numbers, and you want to make a new file called 'file2' that has columns 1 and 2 as before, but also adds a third column which is the ratio of the numbers in columns 1 and 2. Suppose you want the new 3-column file (file2) to contain only those lines with column 1 smaller than column 2. Either of the following two commands does what you want:

awk '$1 < $2 {print $0, $1/$2}' file1 > file2

-- or --

cat file1 | awk '$1 < $2 {print $0, $1/$2}' > file2

Let's look at the second one. You all know that 'cat file1' prints the contents of file1 to your screen. The | (called a pipe) directs the output of 'cat file1', which normally goes to your screen, to the command awk. Awk considers the input from 'cat file1' one line at a time, and tries to match the 'pattern'. The pattern is whatever is between the first ' and the {, in this case the pattern is $1 < $2. If the pattern is false, awk goes on to the next line. If the pattern is true, awk does whatever is in the {}. In this case we have asked awk to check if the first column is less than the second. If there is no pattern, awk assumes the pattern is true, and goes onto the action contained in the {}.

What is the action? Almost always it is a print statement of some sort. In this case we want awk to print the entire line, i.e. $0, and then print the ratio of columns 1 and 2, i.e. $1/$2. We close the action with a }, and close the awk command with a '. Finally, to store the final 3-column output into file2 (otherwise it prints to the screen), we add a '> file2'.
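Both halves are optional, which is worth seeing once. A small sketch with made-up two-column input:

```shell
# Pattern only: the default action is to print the matching line
printf '3 5\n7 2\n' | awk '$1 < $2'
# prints: 3 5

# Action only: a missing pattern is treated as true for every line
printf '3 5\n7 2\n' | awk '{print $1 + $2}'
# prints:
# 8
# 9
```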

As a second example, suppose you have several thousand files you want to move into a new directory and rename by appending a .dat to the filenames. You could do this one by one (several hours), or use vi to make a decent command file to do it (several minutes), or use awk (several seconds). Suppose the files are named junk* (* is a wildcard for any sequence of characters), and need to be moved to ../iraf and have a '.dat' appended to the name. To do this, type

ls junk* | awk '{print "mv "$0" ../iraf/"$0".dat"}' | csh

ls junk* lists the filenames, and this output is piped into awk instead of going to your screen. There is no pattern (nothing between the ' and the {), so awk proceeds to print something for each line. For example, if the first two lines from 'ls junk*' produced junk1 and junk2, respectively, then awk would print:

mv junk1 ../iraf/junk1.dat
mv junk2 ../iraf/junk2.dat

At this point the mv commands are simply printed to the screen. To execute the command we take the output of awk and pipe it back into the operating system (the C-shell). Hence, to finish the statement we add a ' | csh'.
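A sketch of the same trick in a scratch directory (all paths here are hypothetical, chosen so the example is safe to run). It is good practice to print the generated commands and inspect them before piping them into a shell; sh works as the target if csh is not installed:

```shell
# Set up two dummy files in a scratch directory
mkdir -p /tmp/awkdemo/src /tmp/awkdemo/iraf
touch /tmp/awkdemo/src/junk1 /tmp/awkdemo/src/junk2
cd /tmp/awkdemo/src

# Inspect the generated mv commands first...
ls junk* | awk '{print "mv "$0" ../iraf/"$0".dat"}'

# ...then pipe them to a shell to execute them
ls junk* | awk '{print "mv "$0" ../iraf/"$0".dat"}' | sh
ls ../iraf
# prints:
# junk1.dat
# junk2.dat
```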

More complex awk scripts need to be run from a file. The syntax for such cases is:

cat file1 | awk -f a.awk > file2

where file1 is the input file, file2 is the output file, and a.awk is a file containing awk commands. Examples below that contain more than one line of awk need to be run from files.
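For instance, a minimal a.awk could hold the sum-and-average script shown in the examples below (the /tmp path here is illustrative):

```shell
# Write a two-rule awk script to a file...
cat > /tmp/a.awk <<'EOF'
{ s += $1 }
END { print "sum is", s, "average is", s/NR }
EOF

# ...and run it with -f on some one-column input
printf '2\n4\n6\n' | awk -f /tmp/a.awk
# prints: sum is 12 average is 4
```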

Some useful awk variables defined for you are NF (number of columns), NR (the current line that awk is working on), END (true if awk reaches the EOF), BEGIN (true before awk reads anything), and length (number of characters in a line or a string). There is also looping capability, a search (/) command, a substring command (extremely useful), and formatted printing available. There are logical operators || (or) and && (and) that can be used in a 'pattern'. You can define and manipulate your own user-defined variables. Examples are outlined below. The only bug I know of is that Sun's version of awk won't do trig functions, though it does do logs. There is something called gawk (a GNU product), which does a few more things than Sun's awk, but they are basically the same. Note the use of the 'yes' command below. Coupled with 'head' and 'awk' you save an hour of typing if you have a lot of files to analyze or rename.
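To see BEGIN, END, NR, NF, and length working together, here is a one-liner on made-up input; BEGIN fires before any line is read, END after the last one, and NR and NF update for each line:

```shell
printf 'a b c\nd e\n' | awk 'BEGIN {print "start"} {print NR, NF, length($0)} END {print "total lines:", NR}'
# prints:
# start
# 1 3 5
# 2 2 3
# total lines: 2
```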

Good luck!
EXAMPLES      # is the comment character for awk.  'field' means 'column'

# Print first two fields in opposite order:
awk '{ print $2, $1 }' file


# Print lines longer than 72 characters:
awk 'length > 72' file


# Print length of string in 2nd column
awk '{print length($2)}' file


# Add up first column, print sum and average:
{ s += $1 }
END { print "sum is", s, " average is", s/NR }


# Print fields in reverse order:
awk '{ for (i = NF; i > 0; --i) print $i }' file


# Print the last line
{line = $0}
END {print line}


# Print the total number of lines that contain the word Pat
/Pat/ {nlines = nlines + 1}
END {print nlines}


# Print all lines between start/stop pairs:
awk '/start/, /stop/' file


# Print all lines whose first field is different from previous one:
awk '$1 != prev { print; prev = $1 }' file


# Print column 3 if column 1 > column 2:
awk '$1 > $2 {print $3}' file


# Print line if column 3 > column 2:
awk '$3 > $2' file


# Count number of lines where col 3 > col 1
awk '$3 > $1 {n++} END {print n}' file


# Print sequence number and then column 1 of file:
awk '{print NR, $1}' file


# Print every line after erasing the 2nd field
awk '{$2 = ""; print}' file


# Print hi 28 times
yes | head -28 | awk '{ print "hi" }'


# Print hi0010 to hi0099 (NOTE IRAF USERS!)
yes | head -90 | awk '{printf("hi%04d\n", NR+9)}'

# Print out 4 random numbers between 0 and 1
yes | head -4 | awk '{print rand()}'

# Print out 40 random integers modulo 5
yes | head -40 | awk '{print int(100*rand()) % 5}'


# Replace every field by its absolute value
{ for (i = 1; i <= NF; i++) if ($i < 0) $i = -$i; print }

# If you have another character that delimits fields, use the -F option
# For example, to print out the phone number for Jones in the following file,
# 000902|Beavis|Theodore|333-242-2222|149092
# 000901|Jones|Bill|532-382-0342|234023
# ...
# type
awk -F"|" '$2=="Jones"{print $4}' filename



# Some looping commands
# Remove a bunch of print jobs from the queue
BEGIN{
for (i=875;i>833;i--){
printf "lprm -Plw %d\n", i
} exit
}


Formatted printouts are of the form printf("format\n", value1, value2, ... valueN)
e.g. printf("howdy %-8s What it is bro. %.2f\n", $1, $2*$3)
%s = string
%-8s = 8 character string left justified
%.2f = number with 2 places after .
%6.2f = field 6 chars with 2 chars after .
\n is newline
\t is a tab
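The specifiers above, demonstrated on a made-up input line:

```shell
# %-8s pads "howdy" to 8 characters, %.2f and %6.2f round to 2 decimals,
# with %6.2f also right-justifying in a 6-character field
echo 'howdy 3 4' | awk '{printf("%-8s|%.2f|%6.2f\n", $1, $2*$3, $2/$3)}'
# prints: howdy   |12.00|  0.75
```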


# Print frequency histogram of column of numbers
$2 <= 0.1 {na=na+1}
($2 > 0.1) && ($2 <= 0.2) {nb = nb+1}
($2 > 0.2) && ($2 <= 0.3) {nc = nc+1}
($2 > 0.3) && ($2 <= 0.4) {nd = nd+1}
($2 > 0.4) && ($2 <= 0.5) {ne = ne+1}
($2 > 0.5) && ($2 <= 0.6) {nf = nf+1}
($2 > 0.6) && ($2 <= 0.7) {ng = ng+1}
($2 > 0.7) && ($2 <= 0.8) {nh = nh+1}
($2 > 0.8) && ($2 <= 0.9) {ni = ni+1}
($2 > 0.9) {nj = nj+1}
END {print na, nb, nc, nd, ne, nf, ng, nh, ni, nj, NR}


# Find maximum and minimum values present in column 1
NR == 1 {m=$1 ; p=$1}
$1 >= m {m = $1}
$1 <= p {p = $1}
END { print "Max = " m, " Min = " p }

# Example of defining variables, multiple commands on one line
NR == 1 {prev=$4; preva = $1; prevb = $2; n=0; sum=0}
$4 != prev {print preva, prevb, prev, sum/n; n=0; sum=0; prev = $4; preva = $1; prevb = $2}
$4 == prev {n++; sum=sum+$5/$6}
END {print preva, prevb, prev, sum/n}

# Example of defining and using a function, inserting values into an array
# and doing integer arithmetic mod(n). This script finds the number of days
# elapsed since Jan 1, 1901. (from http://www.netlib.org/research/awkbookcode/ch3)
function daynum(y, m, d, days, i, n)
{ # 1 == Jan 1, 1901
split("31 28 31 30 31 30 31 31 30 31 30 31", days)
# 365 days a year, plus one for each leap year
n = (y-1901) * 365 + int((y-1901)/4)
if (y % 4 == 0) # leap year from 1901 to 2099
days[2]++
for (i = 1; i < m; i++)
n += days[i]
return n + d
}
{ print daynum($1, $2, $3) }
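To run the script above from the shell, save it to a file and feed it lines of "year month day" (the /tmp path here is illustrative); by the script's own convention, Jan 1, 1901 is day 1:

```shell
# Save the daynum script to a file
cat > /tmp/daynum.awk <<'EOF'
function daynum(y, m, d,    days, i, n)
{   # 1 == Jan 1, 1901
    split("31 28 31 30 31 30 31 31 30 31 30 31", days)
    # 365 days a year, plus one for each leap year
    n = (y-1901) * 365 + int((y-1901)/4)
    if (y % 4 == 0)   # leap year from 1901 to 2099
        days[2]++
    for (i = 1; i < m; i++)
        n += days[i]
    return n + d
}
{ print daynum($1, $2, $3) }
EOF

echo "1901 1 1" | awk -f /tmp/daynum.awk
# prints: 1
echo "1901 2 1" | awk -f /tmp/daynum.awk
# prints: 32
```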

# Example of using substrings
# substr($2,9,7) picks out characters 9 thru 15 of column 2
{print "imarith", substr($2,1,7) " - " $3, "out."substr($2,5,3)}
{print "imarith", substr($2,9,7) " - " $3, "out."substr($2,13,3)}
{print "imarith", substr($2,17,7) " - " $3, "out."substr($2,21,3)}
{print "imarith", substr($2,25,7) " - " $3, "out."substr($2,29,3)}
posted on 2010-05-18 19:07 by baby-fly, Views(384), Comments(0), Category: Ubuntu&Linux