
            from http://docs.continuum.io/anaconda-cluster/examples/spark-caffe

            Deep Learning (Spark, Caffe, GPU)

            Description

To demonstrate running a distributed GPU job in PySpark, this example uses the neural network library Caffe. The example below is deliberately trivial: each node trains the same model independently, which is redundant, but it demonstrates that neural networks can be trained with GPUs from within a Spark job.

            For this example, we recommend the use of the AMI ami-2cbf3e44 and the instance type g2.2xlarge. An example profile (to be placed in ~/.acluster/profiles.d/gpu_profile.yaml) is shown below:

            name: gpu_profile
            node_id: ami-2cbf3e44 # Ubuntu 14.04 - IS HVM - Cuda 6.5
            user: ubuntu
            node_type: g2.2xlarge
            num_nodes: 3
            provider: aws
            plugins:
              - spark-yarn
              - notebook
            
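Once the profile is in place, the cluster itself is launched from it. The exact command is not shown in the original example; the cluster name below is arbitrary and the invocation is an assumption based on the usual acluster workflow:

$ acluster create gpu-cluster --profile gpu_profile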

            Download

            To execute this example, download the spark-caffe.py example script or the spark-caffe.ipynb example notebook.

            Installation

            The Spark + YARN plugin can be installed on the cluster using the following command:

            $ acluster install spark-yarn
            

            Once the Spark + YARN plugin is installed, you can view the YARN UI in your browser using the following command:

            $ acluster open yarn
            

            Dependencies

            First, we need to bootstrap Caffe and its dependencies on all of the nodes. We provide a bash script that will install Caffe from source: bootstrap-caffe.sh. The following command can be used to upload the bootstrap-caffe.sh script to all of the nodes and execute it in parallel:

            $ acluster submit bootstrap-caffe.sh --all
            

            After a few minutes, Caffe and its dependencies will be installed on the cluster nodes and the job can be started.
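Before submitting the training job, it can be worth confirming that the build succeeded on every node. The short PySpark snippet below is a sanity check that is not part of the original example; it assumes the bootstrap script performed a default source build, leaving the compiled binary at /home/ubuntu/caffe/build/tools/caffe:

# check-caffe.py -- optional sanity check, not part of the original example
from pyspark import SparkConf
from pyspark import SparkContext

conf = SparkConf()
conf.setMaster('yarn-client')
conf.setAppName('check-caffe')
sc = SparkContext(conf=conf)

def has_caffe(x):
    import os
    import socket
    # Assumed binary location for a default Caffe source build
    return socket.gethostname(), os.path.exists('/home/ubuntu/caffe/build/tools/caffe')

# One partition per worker node, mirroring the main example
print(sc.parallelize(range(2), 2).map(has_caffe).distinct().collect())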

            Running the Job

            Here is the complete script to run the Spark + GPU with Caffe example in PySpark:

# spark-caffe.py
from pyspark import SparkConf
from pyspark import SparkContext

conf = SparkConf()
conf.setMaster('yarn-client')
conf.setAppName('spark-caffe')
sc = SparkContext(conf=conf)


def noop(x):
    import socket
    return socket.gethostname()

# Confirm that the job reaches both worker nodes
rdd = sc.parallelize(range(2), 2)
hosts = rdd.map(noop).distinct().collect()
print(hosts)


def caffe_process(x):
    import os
    # Make the CUDA toolkit and the LMDB library visible to the Caffe binaries
    os.environ['PATH'] = '/usr/local/cuda/bin' + ':' + os.environ['PATH']
    os.environ['LD_LIBRARY_PATH'] = '/usr/local/cuda/lib64:/home/ubuntu/pombredanne-https-gitorious.org-mdb-mdb.git-9cc04f604f80/libraries/liblmdb'
    import subprocess
    # Run the stock LeNet/MNIST training script that ships with Caffe
    proc = subprocess.Popen('cd /home/ubuntu/caffe && bash ./examples/mnist/train_lenet.sh',
                            shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = proc.communicate()
    return proc.returncode, out, err

# One partition per GPU node; each node runs the training script once
rdd = sc.parallelize(range(2), 2)
ret = rdd.map(caffe_process).distinct().collect()
print(ret)
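The sc.parallelize(range(2), 2) call creates one partition per worker node, so each GPU node runs train_lenet.sh once, and distinct() collapses duplicate results. If you want the driver to fail loudly when training breaks on any node, the script could be extended with a check like the following (an addition, not part of the original example):

# Optional addition to the end of spark-caffe.py: surface failures from any node
for returncode, out, err in ret:
    if returncode != 0:
        raise RuntimeError('Caffe training failed on a node:\n%s' % err)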

            You can submit the script to the Spark cluster using the submit command.

            $ acluster submit spark-caffe.py 

            After the script completes, the trained Caffe model can be found at /home/ubuntu/caffe/examples/mnist/lenet_iter_10000.caffemodel on all of the compute nodes.
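Each node's trained weights can then be loaded for inference with Caffe's Python bindings. The snippet below is a sketch that is not part of the original example; it assumes the stock LeNet definition at /home/ubuntu/caffe/examples/mnist/lenet.prototxt and that /home/ubuntu/caffe/python is on PYTHONPATH:

# load-model.py -- hypothetical follow-up: inspect the trained weights on a node
# (assumes /home/ubuntu/caffe/python is on PYTHONPATH)
import caffe

caffe.set_mode_gpu()
net = caffe.Net('/home/ubuntu/caffe/examples/mnist/lenet.prototxt',
                '/home/ubuntu/caffe/examples/mnist/lenet_iter_10000.caffemodel',
                caffe.TEST)
# Print the blob names and shapes of the trained LeNet network
for name, blob in net.blobs.items():
    print(name, blob.data.shape)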

            posted on 2015-10-14 17:25, 3585 reads, 1 comment. Categories: life, musings about artificial intelligence

            Comments:
            # re: Deep Learning (Spark, Caffe, GPU) 2015-10-21 18:19 | 春秋十二月
            What is this?