
Building a Scrapy Crawler Image on Alpine with a Dockerfile


1. Pull the Alpine image

[root@dockerbrian ~]# docker pull alpine
Using default tag: latest
Trying to pull repository docker.io/library/alpine ...
latest: Pulling from docker.io/library/alpine
4fe2ade4980c: Pull complete
Digest: sha256:621c2f39f8133acb8e64023a94dbdf0d5ca81896102b9e57c0dc184cadaf5528
Status: Downloaded newer image for docker.io/alpine:latest
[root@dockerbrian ~]# docker images
REPOSITORY         TAG      IMAGE ID       CREATED       SIZE
docker.io/alpine   latest   196d12cf6ab1   3 weeks ago   4.41 MB
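This pulls the moving latest tag. For a reproducible build you may prefer to pin a specific release; a minimal sketch (the 3.8 tag here is only illustrative, pick whichever release you need):

# pin a specific Alpine release instead of the moving "latest" tag (3.8 is illustrative)
docker pull alpine:3.8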

2. Write the Dockerfile

Create a scrapy directory to hold the Dockerfile:

[root@dockerbrian ~]# mkdir /opt/alpinedockerfile/
[root@dockerbrian ~]# cd /opt/alpinedockerfile/
[root@dockerbrian alpinedockerfile]# mkdir scrapy && cd scrapy && touch dockerfile
[root@dockerbrian scrapy]# ll
total 4
-rw-r--r-- 1 root root 1394 Oct 10 11:36 dockerfile

Write the Dockerfile:

# Base image to build from
FROM alpine

# Maintainer information
MAINTAINER alpine_python3_scrapy (zhujingzhi@123.com)

# Switch the apk repositories to the Aliyun mirror
RUN echo "http://mirrors.aliyun.com/alpine/latest-stable/main/" > /etc/apk/repositories && \
  echo "http://mirrors.aliyun.com/alpine/latest-stable/community/" >> /etc/apk/repositories

# Update the package index, install openssh-server and tzdata, set the timezone,
# allow root login over ssh, generate host keys, and set the root password
RUN apk update && \
  apk add --no-cache openssh-server tzdata && \
  cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime && \
  sed -i "s/#PermitRootLogin.*/PermitRootLogin yes/g" /etc/ssh/sshd_config && \
  ssh-keygen -t rsa -P "" -f /etc/ssh/ssh_host_rsa_key && \
  ssh-keygen -t ecdsa -P "" -f /etc/ssh/ssh_host_ecdsa_key && \
  ssh-keygen -t ed25519 -P "" -f /etc/ssh/ssh_host_ed25519_key && \
  echo "root:h056zhjlg85ow5xh7vtsa" | chpasswd

# Install the system packages Scrapy depends on (all of these are required)
RUN apk add --no-cache python3 python3-dev gcc openssl-dev openssl libressl libc-dev linux-headers libffi-dev libxml2-dev libxml2 libxslt-dev openssh-client openssh-sftp-server

# Install the pip packages the environment needs (add or remove packages here as required)
RUN pip3 install --default-timeout=100 --no-cache-dir --upgrade pip setuptools pymysql pymongo redis scrapy-redis ipython scrapy requests

# Create the ssh startup script (-D keeps sshd in the foreground)
RUN echo "/usr/sbin/sshd -D" >> /etc/start.sh && \
  chmod +x /etc/start.sh

# Expose port 22
EXPOSE 22

# Launch sshd via the startup script
CMD ["/bin/sh","/etc/start.sh"]

With this Dockerfile, the container can be reached remotely over SSH and has Scrapy installed in a Python 3 environment; the start.sh script is what launches the SSH service.
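Note that /etc/start.sh ends up holding nothing but the single sshd line appended in the Dockerfile, and the CMD runs it through /bin/sh, so the missing shebang is harmless. The -D flag keeps sshd in the foreground; if it daemonized, the script would exit and the container would stop with it. For reference:

# contents of /etc/start.sh inside the built image
/usr/sbin/sshd -D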

3. Build the image

Build the image:

[root@dockerbrian scrapy]# docker build -t scrapy_redis_ssh:v1 . 
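If you later change the mirror list or the package set and want a completely clean rebuild, Docker's --no-cache flag forces every layer to be rebuilt instead of being reused from the cache (slower, but avoids stale layers):

# optional: rebuild all layers from scratch
docker build --no-cache -t scrapy_redis_ssh:v1 .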

List the images:

[root@dockerbrian scrapy]# docker images
REPOSITORY         TAG      IMAGE ID       CREATED       SIZE
scrapy_redis_ssh   v1       b2c95ef95fb9   4 hours ago   282 MB
docker.io/alpine   latest   196d12cf6ab1   4 weeks ago   4.41 MB

4. Create the container

Create the container (named scrapy10086, with SSH mapped to host port 10086):

docker run -itd --restart=always --name scrapy10086 -p 10086:22 scrapy_redis_ssh:v1
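Before logging in, you can confirm the mapping with docker port, which prints the host address bound to container port 22 (a quick sanity check, not a required step):

# show where container port 22 is published on the host
docker port scrapy10086 22
# expected output: 0.0.0.0:10086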

List the running containers:

[root@dockerbrian scrapy]# docker ps
CONTAINER ID   IMAGE          COMMAND                  CREATED       STATUS       PORTS                   NAMES
7fb9e69d79f5   b2c95ef95fb9   "/bin/sh /etc/star..."   3 hours ago   Up 3 hours   0.0.0.0:10086->22/tcp   scrapy10086

Log in to the container:

[root@dockerbrian scrapy]# ssh root@127.0.0.1 -p 10086
The authenticity of host '[127.0.0.1]:10086 ([127.0.0.1]:10086)' can't be established.
ECDSA key fingerprint is SHA256:wc46au6sljhyefqwx6d6ht9mdpgkodemok6/concpxk.
ECDSA key fingerprint is MD5:6a:b7:31:3c:63:02:ca:74:5b:d9:68:42:08:be:22:fc.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[127.0.0.1]:10086' (ECDSA) to the list of known hosts.
root@127.0.0.1's password:                                # the password is the one set in the Dockerfile: echo "root:h056zhjlg85ow5xh7vtsa" | chpasswd
Welcome to Alpine!

The Alpine Wiki contains a large amount of how-to guides and general
information about administrating Alpine systems.
See <http://wiki.alpinelinux.org>.

You can setup the system with the command: setup-alpine

You may change this message by editing /etc/motd.

7363738cc96a:~#
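One practical note: if you later rebuild the image and recreate the container, new host keys are generated and ssh will refuse to connect because of the cached fingerprint. The stale entry can be removed with ssh-keygen (standard OpenSSH behavior, nothing specific to this image):

# drop the cached host key for the mapped port after recreating the container
ssh-keygen -R "[127.0.0.1]:10086"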

5. Test

Create a Scrapy project to test the setup:

7363738cc96a:~# scrapy startproject test
New Scrapy project 'test', using template directory '/usr/lib/python3.6/site-packages/scrapy/templates/project', created in:
  /root/test

You can start your first spider with:
  cd test
  scrapy genspider example example.com
7363738cc96a:~# cd test/
7363738cc96a:~/test# ls
scrapy.cfg test
7363738cc96a:~/test# cd test/
7363738cc96a:~/test/test# ls
__init__.py   __pycache__   items.py    middlewares.py pipelines.py  settings.py   spiders
7363738cc96a:~/test/test#
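To go one step beyond listing the project files, you can generate and run the example spider that Scrapy itself suggests; example.com is just the placeholder domain from the genspider template:

# from the project root (/root/test)
scrapy genspider example example.com
scrapy crawl example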

The test succeeded.

That's all for this article. We hope it helps with your learning, and we hope you will continue to support 移动技术网.
