I am very new to Python and am trying to install linkchecker on Windows 7. Some notes:

- pip install is failing no matter the package. For example, pip install scrapy also results in the SSL error.
- A vanilla install of Python 3.4.1 included pip 1.5.6. The first thing I tried to do was install linkchecker.
- Python 2.7 was already installed; it came with ArcGIS. python and pip were not available from the command line until I installed 3.4.1.
- pip search linkchecker works. Perhaps that is because pip search does not verify the site's SSL certificate.
- I am on a company network, but we do not go through a proxy to reach the Internet.
- Each company computer (including mine) has a Trusted Root Certificate Authority that is used for various reasons, including enabling monitoring of TLS traffic to https://google.com. Not sure if that has anything to do with it.

Here are the contents of my pip.log after running pip install linkchecker:

Downloading/unpacking linkchecker
  Getting page https://pypi.python.org/simple/linkchecker/
  Could not fetch URL https://pypi.python.org/simple/linkchecker/: connection error: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:598)
  Will skip URL https://pypi.python.org/simple/linkchecker/ when looking for download links for linkchecker
  Getting page https://pypi.python.org/simple/
  Could not fetch URL https://pypi.python.org/simple/: connection error: HTTPSConnectionPool(host='pypi.python.org', port=443): Max retries exceeded with url: /simple/ (Caused by <class 'http.client.CannotSendRequest'>: Request-sent)
  Will skip URL https://pypi.python.org/simple/ when looking for download links for linkchecker
  Cannot fetch index base URL https://pypi.python.org/simple/
  URLs to search for versions for linkchecker:
  * https://pypi.python.org/simple/linkchecker/
  Getting page https://pypi.python.org/simple/linkchecker/
  Could not fetch URL https://pypi.python.org/simple/linkchecker/: connection error: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:598)
  Will skip URL https://pypi.python.org/simple/linkchecker/ when looking for download links for linkchecker
  Could not find any downloads that satisfy the requirement linkchecker
Cleaning up...
  Removing temporary dir C:\Users\jcook\AppData\Local\Temp\pip_build_jcook...
No distributions at all found for linkchecker
Exception information:
Traceback (most recent call last):
  File "C:\Python34\lib\site-packages\pip\basecommand.py", line 122, in main
    status = self.run(options, args)
  File "C:\Python34\lib\site-packages\pip\commands\install.py", line 278, in run
    requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
  File "C:\Python34\lib\site-packages\pip\req.py", line 1177, in prepare_files
    url = finder.find_requirement(req_to_install, upgrade=self.upgrade)
  File "C:\Python34\lib\site-packages\pip\index.py", line 277, in find_requirement
    raise DistributionNotFound('No distributions at all found for %s' % req)
pip.exceptions.DistributionNotFound: No distributions at all found for linkchecker

Current answer

If you are using a virtual environment, I would recommend this solution:

pyenv install <wanted_python_version>

Other answers

I ran into a similar problem. What worked for me:

1) Uninstall Python 2.7
2) Delete the Python27 folder
3) Reinstall the latest Python

pip install gensim config --global http.sslVerify false

Just install any package with the "config --global http.sslVerify false" statement appended.

You can ignore the SSL errors by setting pypi.org and files.pythonhosted.org, as well as the older pypi.python.org, as trusted hosts.

$ pip install --trusted-host pypi.org --trusted-host pypi.python.org --trusted-host files.pythonhosted.org <package_name>

Note: Sometime in April 2018, the Python Package Index migrated from pypi.python.org to pypi.org. That means "trusted host" commands using the old domain no longer work, but you can add both.

Permanent fix

Since the release of pip 10.0, you should be able to fix this permanently just by upgrading pip itself:

$ pip install --trusted-host pypi.org --trusted-host pypi.python.org --trusted-host files.pythonhosted.org pip setuptools

Or by reinstalling it to get the latest version:

$ curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py

(…then run get-pip.py with the relevant Python interpreter).

pip install <otherpackage> should work after this. If not, you will need to do more, as explained below.


You may want to add the trusted hosts and proxy to your config file.

pip.ini (Windows) or pip.conf (Unix):

[global]
trusted-host = pypi.python.org
               pypi.org
               files.pythonhosted.org
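The [global] block above can also be generated programmatically. A minimal Python sketch (it only prints the file contents; you would still save the output yourself, typically to %APPDATA%\pip\pip.ini on Windows or ~/.config/pip/pip.conf on Linux):

```python
import configparser
import io

# Build the same "[global] trusted-host = ..." block shown above.
# configparser writes multi-line values with the continuation-line
# indentation that pip's config parser accepts.
config = configparser.ConfigParser()
config["global"] = {
    "trusted-host": "\n".join([
        "pypi.org",
        "pypi.python.org",
        "files.pythonhosted.org",
    ]),
}

buf = io.StringIO()
config.write(buf)
print(buf.getvalue())
```

Recent pip versions can also write this file directly with `pip config set`, which avoids hand-editing entirely.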

Alternative solutions (less secure)

Most of the answers here could pose a security issue.

Two workarounds that help you install most Python packages with ease:

- Using easy_install: if you are really lazy and don't want to waste much time, use easy_install <package_name>. Note that some packages won't be found, or small errors may occur.
- Using wheels: download the wheel of the Python package and install it with the pip command pip install wheel_package_name.whl.

I recently ran into this problem because my company's web content filter uses its own Certificate Authority so that it can filter SSL traffic. In my case, pip didn't seem to be using the system's CA certificates, producing the error you mention. Downgrading pip to 1.2.1 presented its own set of problems later on, so I went back to the original version that came with Python 3.4.

My workaround was quite simple: use easy_install. Either it doesn't check certs (like the old pip versions) or it knows to use the system certs, because it works for me every time, and I can still use pip to uninstall packages installed with easy_install.

If that doesn't work, and you have access to a network or computer that doesn't have the problem, you can always set up your own personal PyPI server: How to roll my own PyPI repository index without a mirror?

I almost did just that, before I tried easy_install as one last-ditch effort.

Despite the 40 answers here, I don't think any fully addressed my problem.

I am on macOS Catalina 10.15.5, behind a corporate proxy.

When trying to install or upgrade a package, the following errors appeared:

$ pip install <package name>

Looking in indexes: https://pypi.org/simple, https://data:****@pypi.<company>.com/simple/
WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1108)'))': <package name>
WARNING: Retrying (Retry(total=3, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1108)'))': <package name>

@Steve_Tauber's answer, pip --cert /etc/ssl/certs/FOO_Root_CA.pem install linkchecker, got me part of the way there.

I was able to successfully install packages using an existing cert file like so:

pip install --cert /Users/me/opt/anaconda3/ssl/cert.pem --upgrade pip

But I didn't want to use the cert flag every time I used pip…

The answer was to set environment variables:

CERT_PATH=/Users/me/opt/anaconda3/ssl/cert.pem
export SSL_CERT_FILE=${CERT_PATH}
export REQUESTS_CA_BUNDLE=${CERT_PATH}

Now I can install.
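To confirm which CA bundle your interpreter is actually picking up before and after setting those variables, the stdlib ssl module can report the compiled-in defaults and the environment override. A small sketch (read-only, no pip involved):

```python
import os
import ssl

# Report where this Python looks for CA certificates by default,
# plus any override coming from the SSL_CERT_FILE environment variable.
paths = ssl.get_default_verify_paths()
print("default cafile:", paths.cafile)
print("openssl cafile:", paths.openssl_cafile)
print("SSL_CERT_FILE env:", os.environ.get("SSL_CERT_FILE", "<not set>"))
```

If SSL_CERT_FILE (and REQUESTS_CA_BUNDLE for requests-based tools) points at a bundle containing your corporate CA, pip's verification should succeed without the --cert flag.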

Set the time and date correctly!

For me, the date and time were misconfigured on my Raspberry Pi. The result was that all SSL and HTTPS connections to the https://files.pythonhosted.org/ server failed.

Update it like this:

sudo date -s "Wed Thu  23 11:12:00 GMT+1 2018"
sudo dpkg-reconfigure tzdata

Or get the time directly from Google:

Ref: https://superuser.com/a/635024/935136

sudo date -s "$(curl -s --head http://google.com | grep ^Date: | sed 's/Date: //g')"
sudo dpkg-reconfigure tzdata
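The curl pipeline above extracts the HTTP Date header as plain text. For reference, the same RFC 2822 date format can be parsed with Python's stdlib (the header string below is illustrative, not fetched from Google):

```python
from email.utils import parsedate_to_datetime

# Parse an HTTP Date header of the kind the curl pipeline extracts.
# Sample value only -- a real check would read it from a server response.
header = "Date: Wed, 23 May 2018 11:12:00 GMT"
stamp = parsedate_to_datetime(header.split("Date: ", 1)[1])
print(stamp.isoformat())
```

Comparing that timestamp against the local clock is a quick way to spot the clock skew that breaks certificate validity checks.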