
為生存而奔跑


------------------------- How to Use AWK --------------------------

Awk is a powerful command language that allows the user to manipulate files containing columns of data and strings. Awk is extremely useful, both for general operation of Unix commands and for data reduction (e.g. IRAF). You might also want to learn how to use the stream editor sed. Many applications of awk resemble tasks done on PC spreadsheets.

This file contains a number of examples of how to use awk. I have compiled this collection gradually over a couple of years as I've learned to do new things. Everyone who reduces data with IRAF should learn the fundamentals of AWK. Learning to do even simple things will save you a lot of time in the long run. It should take you less than an hour to read through this file and learn the basics.

There are two ways to run awk. A simple awk command can be run from a single command line. More complex awk scripts should be written to a command file. I present examples of both types of input below.

Awk takes each line of input and tries to match the 'pattern' (see below), and if it succeeds it will do whatever you tell it to do within the {} (called the action). Awk works best on files that have columns of numbers or strings that are separated by whitespace (tabs or spaces), though on most machines you can use the -F option if your columns are set apart by another character. Awk refers to the first column as $1, the second column as $2, etc., and the whole line as $0. If you have a file (such as a catalog) that always has numbers in specific columns, you may also want to run the command 'colrm' and combine it with awk. There is a manual page on colrm. There is also a very incomplete man page on awk.

I'll lead you through two examples. First, suppose you have a file called 'file1' that has 2 columns of numbers, and you want to make a new file called 'file2' that has columns 1 and 2 as before, but also adds a third column which is the ratio of the numbers in columns 1 and 2. Suppose you want the new 3-column file (file2) to contain only those lines with column 1 smaller than column 2. Either of the following two commands does what you want:

awk '$1 < $2 {print $0, $1/$2}' file1 > file2

-- or --

cat file1 | awk '$1 < $2 {print $0, $1/$2}' > file2

Let's look at the second one. You all know that 'cat file1' prints the contents of file1 to your screen. The | (called a pipe) directs the output of 'cat file1', which normally goes to your screen, to the command awk. Awk considers the input from 'cat file1' one line at a time, and tries to match the 'pattern'. The pattern is whatever is between the first ' and the {; in this case the pattern is $1 < $2. If the pattern is false, awk goes on to the next line. If the pattern is true, awk does whatever is in the {}. In this case we have asked awk to check if the first column is less than the second. If there is no pattern, awk assumes the pattern is true, and goes on to the action contained in the {}.

What is the action? Almost always it is a print statement of some sort. In this case we want awk to print the entire line, i.e. $0, and then print the ratio of columns 1 and 2, i.e. $1/$2. We close the action with a }, and close the awk command with a '. Finally, to store the final 3-column output into file2 (otherwise it prints to the screen), we add a '> file2'.
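To see the command in action, here is a tiny run with made-up numbers (the file names file1/file2 come from the example above; the data itself is invented):

```shell
cd "$(mktemp -d)"                      # scratch directory for the demo
printf '1 2\n4 3\n2 8\n' > file1       # three lines, two columns
awk '$1 < $2 {print $0, $1/$2}' file1 > file2
cat file2
# prints:
# 1 2 0.5
# 2 8 0.25
```

The middle line (4 3) fails the pattern $1 < $2, so awk silently skips it.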

As a second example, suppose you have several thousand files you want to move into a new directory and rename by appending a .dat to the filenames. You could do this one by one (several hours), or use vi to make a decent command file to do it (several minutes), or use awk (several seconds). Suppose the files are named junk* (* is a wildcard for any sequence of characters), and need to be moved to ../iraf and have a '.dat' appended to the name. To do this, type

ls junk* | awk '{print "mv "$0" ../iraf/"$0".dat"}' | csh

ls junk* lists the filenames, and this output is piped into awk instead of going to your screen. There is no pattern (nothing between the ' and the {), so awk proceeds to print something for each line. For example, if the first two lines from 'ls junk*' produced junk1 and junk2, respectively, then awk would print:

mv junk1 ../iraf/junk1.dat
mv junk2 ../iraf/junk2.dat

At this point the mv commands are simply printed to the screen. To execute the command we take the output of awk and pipe it back into the operating system (the C-shell). Hence, to finish the statement we add a ' | csh'.
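A cautious way to try this is to run everything except the final '| csh' and inspect the generated commands first. A sketch with two fake files (the ../iraf path is just the example's; nothing is actually moved here):

```shell
cd "$(mktemp -d)"                      # scratch directory
touch junk1 junk2                      # fake data files
# Build the mv commands but do NOT pipe them to a shell yet.
cmds=$(ls junk* | awk '{print "mv "$0" ../iraf/"$0".dat"}')
echo "$cmds"
# prints:
# mv junk1 ../iraf/junk1.dat
# mv junk2 ../iraf/junk2.dat
```

Once the printed commands look right, append '| csh' (or '| sh') to execute them.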

More complex awk scripts need to be run from a file. The syntax for such cases is:

cat file1 | awk -f a.awk > file2

where file1 is the input file, file2 is the output file, and a.awk is a file containing awk commands. Examples below that contain more than one line of awk need to be run from files.
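As a sketch of this workflow, here is the sum/average script from the examples below stored in a file and run with -f (the names a.awk and the input data are illustrative):

```shell
cd "$(mktemp -d)"                      # scratch directory
# Write a two-line awk program to a file.
cat > a.awk <<'EOF'
{ s += $1 }
END { print "sum is", s, " average is", s/NR }
EOF
# Feed it three numbers on standard input.
printf '1\n2\n3\n' | awk -f a.awk
# prints: sum is 6  average is 2
```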

Some useful awk variables defined for you are NF (number of columns), NR (the number of the current line awk is working on), END (true when awk reaches the EOF), BEGIN (true before awk reads anything), and length (number of characters in a line or a string). There is also looping capability, a search (/pattern/) command, a substring command (extremely useful), and formatted printing. The logical operators || (or) and && (and) can be used in 'pattern'. You can define and manipulate your own user-defined variables. Examples are outlined below. The only bug I know of is that Sun's version of awk won't do trig functions, though it does do logs. There is also gawk (a GNU product), which does a few more things than Sun's awk, but they are basically the same. Note the use of the 'yes' command below; coupled with 'head' and 'awk', it can save you an hour of typing if you have a lot of files to analyze or rename.
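A quick illustration of NR and NF, run on two invented input lines:

```shell
# NR is the current line number, NF the number of fields on that line.
out=$(printf 'a b\nc d e\n' | awk '{print NR, NF}')
echo "$out"
# prints:
# 1 2
# 2 3
```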

Good luck!
EXAMPLES      # is the comment character for awk.  'field' means 'column'

# Print first two fields in opposite order:
awk '{ print $2, $1 }' file


# Print lines longer than 72 characters:
awk 'length > 72' file


# Print length of string in 2nd column
awk '{print length($2)}' file


# Add up first column, print sum and average:
{ s += $1 }
END { print "sum is", s, " average is", s/NR }


# Print fields in reverse order:
awk '{ for (i = NF; i > 0; --i) print $i }' file


# Print the last line
{line = $0}
END {print line}


# Print the total number of lines that contain the word Pat
/Pat/ {nlines = nlines + 1}
END {print nlines}


# Print all lines between start/stop pairs:
awk '/start/, /stop/' file


# Print all lines whose first field is different from previous one:
awk '$1 != prev { print; prev = $1 }' file


# Print column 3 if column 1 > column 2:
awk '$1 > $2 {print $3}' file


# Print line if column 3 > column 2:
awk '$3 > $2' file


# Count the number of lines where col 3 > col 1
awk '$3 > $1 {i++} END {print i+0}' file


# Print sequence number and then column 1 of file:
awk '{print NR, $1}' file


# Print every line after erasing the 2nd field
awk '{$2 = ""; print}' file


# Print hi 28 times
yes | head -28 | awk '{ print "hi" }'


# Print hi0010 to hi0099 (NOTE IRAF USERS!)
yes | head -90 | awk '{printf("hi00%2.0f \n", NR+9)}'

# Print out 4 random numbers between 0 and 1
yes | head -4 | awk '{print rand()}'

# Print out 40 random integers modulo 5
yes | head -40 | awk '{print int(100*rand()) % 5}'


# Replace every field by its absolute value
{ for (i = 1; i <= NF; i++) if ($i < 0) $i = -$i; print }

# If you have another character that delimits fields, use the -F option
# For example, to print out the phone number for Jones in the following file,
# 000902|Beavis|Theodore|333-242-2222|149092
# 000901|Jones|Bill|532-382-0342|234023
# ...
# type
awk -F"|" '$2=="Jones"{print $4}' filename



# Some looping commands
# Remove a bunch of print jobs from the queue
BEGIN {
    for (i = 875; i > 833; i--) {
        printf "lprm -Plw %d\n", i
    }
    exit
}


Formatted printouts are of the form printf("format\n", value1, value2, ... valueN)
e.g. printf("howdy %-8s What it is bro. %.2f\n", $1, $2*$3)
%s = string
%-8s = 8 character string left justified
%.2f = number with 2 places after .
%6.2f = field 6 chars with 2 chars after .
\n is newline
\t is a tab
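Running the example format string above on one invented line shows the formats at work:

```shell
# %-8s pads "alpha" to 8 characters, left-justified;
# %.2f prints 2*3 with two digits after the decimal point.
out=$(echo 'alpha 2 3' | awk '{printf("howdy %-8s What it is bro. %.2f\n", $1, $2*$3)}')
echo "$out"
```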


# Print frequency histogram of column of numbers
$2 <= 0.1 {na=na+1}
($2 > 0.1) && ($2 <= 0.2) {nb = nb+1}
($2 > 0.2) && ($2 <= 0.3) {nc = nc+1}
($2 > 0.3) && ($2 <= 0.4) {nd = nd+1}
($2 > 0.4) && ($2 <= 0.5) {ne = ne+1}
($2 > 0.5) && ($2 <= 0.6) {nf = nf+1}
($2 > 0.6) && ($2 <= 0.7) {ng = ng+1}
($2 > 0.7) && ($2 <= 0.8) {nh = nh+1}
($2 > 0.8) && ($2 <= 0.9) {ni = ni+1}
($2 > 0.9) {nj = nj+1}
END {print na, nb, nc, nd, ne, nf, ng, nh, ni, nj, NR}
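The ten named counters above can be collapsed into one array indexed by bin number. A sketch on invented single-column data (note it reads $1 rather than $2, and computes the bin with int($1*10), so the handling of values exactly on a bin edge differs slightly from the script above):

```shell
out=$(printf '0.05\n0.15\n0.95\n0.12\n' | awk '
  { b = int($1*10); if (b > 9) b = 9; n[b]++ }   # bin 0..9
  END { for (i = 0; i <= 9; i++) printf "%d ", n[i]; print NR }')
echo "$out"
# prints: 1 2 0 0 0 0 0 0 0 1 4
```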


# Find maximum and minimum values present in column 1
NR == 1 {m=$1 ; p=$1}
$1 >= m {m = $1}
$1 <= p {p = $1}
END { print "Max = " m, " Min = " p }

# Example of defining variables, multiple commands on one line
NR == 1 {prev=$4; preva = $1; prevb = $2; n=0; sum=0}
$4 != prev {print preva, prevb, prev, sum/n; n=0; sum=0; prev = $4; preva = $1; prevb = $2}
$4 == prev {n++; sum=sum+$5/$6}
END {print preva, prevb, prev, sum/n}

# Example of defining and using a function, inserting values into an array
# and doing integer arithmetic mod(n). This script finds the number of days
# elapsed since Jan 1, 1901. (from http://www.netlib.org/research/awkbookcode/ch3)
function daynum(y, m, d, days, i, n)
{ # 1 == Jan 1, 1901
split("31 28 31 30 31 30 31 31 30 31 30 31", days)
# 365 days a year, plus one for each leap year
n = (y-1901) * 365 + int((y-1901)/4)
if (y % 4 == 0) # leap year from 1901 to 2099
days[2]++
for (i = 1; i < m; i++)
n += days[i]
return n + d
}
{ print daynum($1, $2, $3) }
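Running the daynum script above (inlined here unchanged) on two sample dates: Jan 1, 1901 is day 1 by definition, and Mar 1, 2000 works out to day 36220.

```shell
out=$(printf '1901 1 1\n2000 3 1\n' | awk '
function daynum(y, m, d, days, i, n)
{ # 1 == Jan 1, 1901
  split("31 28 31 30 31 30 31 31 30 31 30 31", days)
  n = (y-1901) * 365 + int((y-1901)/4)
  if (y % 4 == 0) # leap year from 1901 to 2099
    days[2]++
  for (i = 1; i < m; i++)
    n += days[i]
  return n + d
}
{ print daynum($1, $2, $3) }')
echo "$out"
# prints:
# 1
# 36220
```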

# Example of using substrings
# substr($2,9,7) picks out characters 9 thru 15 of column 2
{print "imarith", substr($2,1,7) " - " $3, "out."substr($2,5,3)}
{print "imarith", substr($2,9,7) " - " $3, "out."substr($2,13,3)}
{print "imarith", substr($2,17,7) " - " $3, "out."substr($2,21,3)}
{print "imarith", substr($2,25,7) " - " $3, "out."substr($2,29,3)}
posted on 2010-05-18 19:07 by baby-fly, category: Ubuntu&Linux