
為生存而奔跑 (Running for Survival)

------------------------- How to Use AWK --------------------------

Awk is a powerful command language that allows the user to manipulate files containing columns of data and strings. Awk is extremely useful, both for general operation of Unix commands and for data reduction (e.g. IRAF). You might also want to learn to use the stream editor sed. Many applications of awk resemble those done on PC spreadsheets.

This file contains a number of examples of how to use awk. I have compiled this table gradually over a couple of years as I've learned to do new things. Everyone who reduces data with IRAF should learn the fundamentals of AWK. Learning to do even simple things will save you a lot of time in the long run. It should take you less than an hour to read through this file and learn the basics.

There are two ways to run awk. A simple awk command can be run from a single command line. More complex awk scripts should be written to a command file. I present examples of both types of input below.

Awk takes each line of input and tries to match the 'pattern' (see below), and if it succeeds it will do whatever you tell it to do within the {} (called the action). Awk works best on files that have columns of numbers or strings that are separated by whitespace (tabs or spaces), though on most machines you can use the -F option if your columns are set apart by another character. Awk refers to the first column as $1, the second column as $2, etc., and the whole line as $0. If you have a file (such as a catalog) that always has numbers in specific columns, you may also want to run the command 'colrm' and combine it with awk. There is a manual page on colrm. There is also a very incomplete man page on awk.
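As a quick illustration of field references and the -F option, here is a minimal sketch (the inline data is made up for this example):

```shell
# Fields are split on whitespace by default; $0 is the whole line.
echo "10 20" | awk '{print $0, $1, $2}'     # prints: 10 20 10 20

# -F changes the field separator, e.g. for colon-delimited data:
printf 'a:b:c\n' | awk -F":" '{print $2}'   # prints: b
```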

I'll lead you through two examples. First, suppose you have a file called 'file1' that has 2 columns of numbers, and you want to make a new file called 'file2' that has columns 1 and 2 as before, but also adds a third column which is the ratio of the numbers in columns 1 and 2. Suppose you want the new 3-column file (file2) to contain only those lines with column 1 smaller than column 2. Either of the following two commands does what you want:

awk '$1 < $2 {print $0, $1/$2}' file1 > file2

-- or --

cat file1 | awk '$1 < $2 {print $0, $1/$2}' > file2

Let's look at the second one. You all know that 'cat file1' prints the contents of file1 to your screen. The | (called a pipe) directs the output of 'cat file1', which normally goes to your screen, to the command awk. Awk considers the input from 'cat file1' one line at a time, and tries to match the 'pattern'. The pattern is whatever is between the first ' and the {; in this case the pattern is $1 < $2. If the pattern is false, awk goes on to the next line. If the pattern is true, awk does whatever is in the {}. In this case we have asked awk to check if the first column is less than the second. If there is no pattern, awk assumes the pattern is true, and goes on to the action contained in the {}.

What is the action? Almost always it is a print statement of some sort. In this case we want awk to print the entire line, i.e. $0, and then print the ratio of columns 1 and 2, i.e. $1/$2. We close the action with a }, and close the awk command with a '. Finally, to store the final 3-column output into file2 (otherwise it prints to the screen), we add a '> file2'.
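To see the whole command in action, here is a runnable sketch with made-up two-column data (the path /tmp/file1 is just illustrative):

```shell
# Sample data: column 1 < column 2 on the first and third lines only.
printf '1 2\n3 2\n2 4\n' > /tmp/file1

# Keep only lines where $1 < $2, appending the ratio as a third column.
awk '$1 < $2 {print $0, $1/$2}' /tmp/file1
# prints:
# 1 2 0.5
# 2 4 0.5
```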

As a second example, suppose you have several thousand files you want to move into a new directory and rename by appending a .dat to the filenames. You could do this one by one (several hours), or use vi to make a decent command file to do it (several minutes), or use awk (several seconds). Suppose the files are named junk* (* is wildcard for any sequence of characters), and need to be moved to ../iraf and have a '.dat' appended to the name. To do this type

ls junk* | awk '{print "mv "$0" ../iraf/"$0".dat"}' | csh

ls junk* lists the filenames, and this output is piped into awk instead of going to your screen. There is no pattern (nothing between the ' and the {), so awk proceeds to print something for each line. For example, if the first two lines from 'ls junk*' produced junk1 and junk2, respectively, then awk would print:

mv junk1 ../iraf/junk1.dat
mv junk2 ../iraf/junk2.dat

At this point the mv commands are simply printed to the screen. To execute the command we take the output of awk and pipe it back into the operating system (the C-shell). Hence, to finish the statement we add a ' | csh'.

More complex awk scripts need to be run from a file. The syntax for such cases is:

cat file1 | awk -f a.awk > file2

where file1 is the input file, file2 is the output file, and a.awk is a file containing awk commands. Examples below that contain more than one line of awk need to be run from files.
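For example, the ratio one-liner from above could be run from a script file like this (the name /tmp/a.awk is just illustrative):

```shell
# Put the pattern/action into a command file...
cat > /tmp/a.awk <<'EOF'
$1 < $2 { print $0, $1/$2 }
EOF

# ...then run it with -f over the input.
printf '1 2\n3 2\n' | awk -f /tmp/a.awk   # prints: 1 2 0.5
```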

Some useful awk variables defined for you are NF (the number of fields on the current line), NR (the number of the line awk is currently working on), END (a pattern that is true once awk reaches the EOF), BEGIN (true before awk reads anything), and length (the number of characters in a line or a string). There is also looping capability, a search (/pattern/) command, a substring command (extremely useful), and formatted printing. The logical operators || (or) and && (and) can be used in a 'pattern'. You can define and manipulate your own user-defined variables. Examples are outlined below. The only bug I know of is that Sun's version of awk won't do trig functions, though it does do logs. There is also gawk (a GNU product), which does a few more things than Sun's awk, but they are basically the same. Note the use of the 'yes' command below; coupled with 'head' and 'awk' it can save you an hour of typing if you have a lot of files to analyze or rename.
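A quick sketch of NR, NF, BEGIN, and END on made-up input:

```shell
# NR is the line number, NF the field count on that line.
printf 'a b c\nd e\n' | awk '{print NR, NF}'
# prints:
# 1 3
# 2 2

# BEGIN runs before any input is read, END after the last line.
printf '1\n2\n3\n' | awk 'BEGIN{print "start"} {s += $1} END{print "total", s}'
# prints:
# start
# total 6
```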

Good luck!
EXAMPLES      # is the comment character for awk.  'field' means 'column'

# Print first two fields in opposite order:
awk '{ print $2, $1 }' file


# Print lines longer than 72 characters:
awk 'length > 72' file


# Print length of string in 2nd column
awk '{print length($2)}' file


# Add up first column, print sum and average:
{ s += $1 }
END { print "sum is", s, " average is", s/NR }


# Print fields in reverse order:
awk '{ for (i = NF; i > 0; --i) print $i }' file


# Print the last line
{line = $0}
END {print line}


# Print the total number of lines that contain the word Pat
/Pat/ {nlines = nlines + 1}
END {print nlines}


# Print all lines between start/stop pairs:
awk '/start/, /stop/' file


# Print all lines whose first field is different from previous one:
awk '$1 != prev { print; prev = $1 }' file


# Print column 3 if column 1 > column 2:
awk '$1 > $2 {print $3}' file


# Print line if column 3 > column 2:
awk '$3 > $2' file


# Count number of lines where col 3 > col 1
awk '$3 > $1 {print ++i}' file


# Print sequence number and then column 1 of file:
awk '{print NR, $1}' file


# Print every line after erasing the 2nd field
awk '{$2 = ""; print}' file


# Print hi 28 times
yes | head -28 | awk '{ print "hi" }'


# Print hi.0010 to hi.0099 (NOTE IRAF USERS!)
yes | head -90 | awk '{printf("hi00%2.0f \n", NR+9)}'

# Print out 4 random numbers between 0 and 1
yes | head -4 | awk '{print rand()}'

# Print out 40 random integers modulo 5
yes | head -40 | awk '{print int(100*rand()) % 5}'


# Replace every field by its absolute value
{ for (i = 1; i <= NF; i++)
      if ($i < 0) $i = -$i
  print }

# If you have another character that delimits fields, use the -F option
# For example, to print out the phone number for Jones in the following file,
# 000902|Beavis|Theodore|333-242-2222|149092
# 000901|Jones|Bill|532-382-0342|234023
# ...
# type
awk -F"|" '$2=="Jones"{print $4}' filename



# Some looping commands
# Remove a bunch of print jobs from the queue
BEGIN{
for (i=875;i>833;i--){
printf "lprm -Plw %d\n", i
} exit
}


Formatted printouts are of the form printf( "format\n", value1, value2, ... valueN)
e.g. printf("howdy %-8s What it is bro. %.2f\n", $1, $2*$3)
%s = string
%-8s = 8 character string left justified
%.2f = number with 2 places after .
%6.2f = field 6 chars with 2 chars after .
\n is newline
\t is a tab
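A minimal sketch of these format specifiers, using constant values so the padding is easy to see:

```shell
# %-8s left-justifies in an 8-character field;
# %6.2f right-justifies in 6 characters with 2 digits after the point.
awk 'BEGIN{printf("%-8s|%6.2f|\n", "abc", 3.14159)}'
# prints: abc     |  3.14|
```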


# Print frequency histogram of column of numbers
$2 <= 0.1 {na=na+1}
($2 > 0.1) && ($2 <= 0.2) {nb = nb+1}
($2 > 0.2) && ($2 <= 0.3) {nc = nc+1}
($2 > 0.3) && ($2 <= 0.4) {nd = nd+1}
($2 > 0.4) && ($2 <= 0.5) {ne = ne+1}
($2 > 0.5) && ($2 <= 0.6) {nf = nf+1}
($2 > 0.6) && ($2 <= 0.7) {ng = ng+1}
($2 > 0.7) && ($2 <= 0.8) {nh = nh+1}
($2 > 0.8) && ($2 <= 0.9) {ni = ni+1}
($2 > 0.9) {nj = nj+1}
END {print na, nb, nc, nd, ne, nf, ng, nh, ni, nj, NR}


# Find maximum and minimum values present in column 1
NR == 1 {m=$1 ; p=$1}
$1 >= m {m = $1}
$1 <= p {p = $1}
END { print "Max = " m, " Min = " p }

# Example of defining variables, multiple commands on one line
NR == 1 {prev=$4; preva = $1; prevb = $2; n=0; sum=0}
$4 != prev {print preva, prevb, prev, sum/n; n=0; sum=0; prev = $4; preva = $1; prevb = $2}
$4 == prev {n++; sum=sum+$5/$6}
END {print preva, prevb, prev, sum/n}

# Example of defining and using a function, inserting values into an array
# and doing integer arithmetic mod(n). This script finds the number of days
# elapsed since Jan 1, 1901. (from http://www.netlib.org/research/awkbookcode/ch3)
function daynum(y, m, d, days, i, n)
{ # 1 == Jan 1, 1901
split("31 28 31 30 31 30 31 31 30 31 30 31", days)
# 365 days a year, plus one for each leap year
n = (y-1901) * 365 + int((y-1901)/4)
if (y % 4 == 0) # leap year from 1901 to 2099
days[2]++
for (i = 1; i < m; i++)
n += days[i]
return n + d
}
{ print daynum($1, $2, $3) }
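This script can be saved to a file and run with -f, feeding it "year month day" lines (the name /tmp/daynum.awk is just illustrative):

```shell
# Save the function and its driver action to a command file.
cat > /tmp/daynum.awk <<'EOF'
function daynum(y, m, d, days, i, n)
{ # 1 == Jan 1, 1901
    split("31 28 31 30 31 30 31 31 30 31 30 31", days)
    # 365 days a year, plus one for each leap year
    n = (y-1901) * 365 + int((y-1901)/4)
    if (y % 4 == 0) # leap year from 1901 to 2099
        days[2]++
    for (i = 1; i < m; i++)
        n += days[i]
    return n + d
}
{ print daynum($1, $2, $3) }
EOF

echo "1901 1 1" | awk -f /tmp/daynum.awk   # prints: 1
echo "1902 1 1" | awk -f /tmp/daynum.awk   # prints: 366
```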

# Example of using substrings
# substr($2,9,7) picks out characters 9 thru 15 of column 2
{print "imarith", substr($2,1,7) " - " $3, "out."substr($2,5,3)}
{print "imarith", substr($2,9,7) " - " $3, "out."substr($2,13,3)}
{print "imarith", substr($2,17,7) " - " $3, "out."substr($2,21,3)}
{print "imarith", substr($2,25,7) " - " $3, "out."substr($2,29,3)}
posted on 2010-05-18 19:07 by baby-fly | read (384) | comments (0) | category: Ubuntu&Linux