unknown error: session deleted because of page crash from unknown error: cannot determine loading status from tab crashed with ChromeDriver and Selenium

I am using InstaPy, which uses Python and Selenium. I start the script via cron, and sometimes it crashes. It is really irregular; sometimes it runs through just fine. I have also posted about it on the GitHub repo but got no reply, so I am asking here whether anyone knows why.

It is a DigitalOcean Ubuntu server and I am running in headless mode. The driver versions are visible in the log. Here is the error message:

ERROR [2018-12-10 09:53:54] [user]  Error occurred while deleting cookies from web browser!
b'Message: invalid session id\n  (Driver info: chromedriver=2.44.609551 (5d576e9a44fe4c5b6a07e568f1ebc753f1214634),platform=Linux 4.15.0-42-generic x86_64)\n'
Traceback (most recent call last):
  File "/root/InstaPy/instapy/util.py", line 1410, in smart_run
    yield
  File "./my_config.py", line 43, in <module>
    session.follow_user_followers(['xxxx','xxxx','xxxx','xxxx'], amount=100, randomize=True, interact=True)
  File "/root/InstaPy/instapy/instapy.py", line 2907, in follow_user_followers
    self.logfolder)
  File "/root/InstaPy/instapy/unfollow_util.py", line 883, in get_given_user_followers
    channel, jumps, logger, logfolder)
  File "/root/InstaPy/instapy/unfollow_util.py", line 722, in get_users_through_dialog
    person_list = dialog_username_extractor(buttons)
  File "/root/InstaPy/instapy/unfollow_util.py", line 747, in dialog_username_extractor
    person_list.append(person.find_element_by_xpath("../../../*")
  File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webelement.py", line 351, in find_element_by_xpath
    return self.find_element(by=By.XPATH, value=xpath)
  File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webelement.py", line 659, in find_element
    {"using": by, "value": value})['value']
  File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webelement.py", line 633, in _execute
    return self._parent.execute(command, params)
  File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webdriver.py", line 321, in execute
    self.error_handler.check_response(response)
  File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/errorhandler.py", line 242, in check_response
    raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.WebDriverException: Message: unknown error: session deleted because of page crash
  from unknown error: cannot determine loading status
  from tab crashed
    (Session info: headless chrome=70.0.3538.110)
    (Driver info: chromedriver=2.44.609551 (5d576e9a44fe4c5b6a07e568f1ebc753f1214634),platform=Linux 4.15.0-42-generic x86_64)


During handling of the above exception, another exception occurred:


Traceback (most recent call last):
  File "/root/InstaPy/instapy/instapy.py", line 3845, in end
    self.browser.delete_all_cookies()
  File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webdriver.py", line 878, in delete_all_cookies
    self.execute(Command.DELETE_ALL_COOKIES)
  File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webdriver.py", line 321, in execute
    self.error_handler.check_response(response)
  File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/errorhandler.py", line 242, in check_response
    raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.WebDriverException: Message: chrome not reachable
  (Session info: headless chrome=71.0.3578.80)
  (Driver info: chromedriver=2.44.609551 (5d576e9a44fe4c5b6a07e568f1ebc753f1214634),platform=Linux 4.15.0-42-generic x86_64)

Do you know what the cause is and how to fix it?

Thanks for your suggestions. Someone from http://treestones.ch/ helped me.


Though you see the error as:

Error occurred while deleting cookies from web browser!
b'Message: invalid session id\n  (Driver info: chromedriver=2.44.609551 (5d576e9a44fe4c5b6a07e568f1ebc753f1214634),platform=Linux 4.15.0-42-generic x86_64)\n'

The main exception is:

selenium.common.exceptions.WebDriverException: Message: unknown error: session deleted because of page crash
from unknown error: cannot determine loading status
from tab crashed

Your code trials would have given us some clues about what is going wrong.


Solution

There are diverse solutions to this issue. However, as per UnknownError: session deleted because of page crash from tab crashed, this issue can be solved by any of the following:

  • Add the following chrome_options:

    chrome_options.add_argument('--no-sandbox')
    
  • Chrome seems to crash in Docker containers on certain pages because /dev/shm is too small, so you may have to increase the /dev/shm size.

  • An example:

    sudo mount -t tmpfs -o rw,nosuid,nodev,noexec,relatime,size=512M tmpfs /dev/shm
    
  • It also works if you use the -v /dev/shm:/dev/shm option to share the host's /dev/shm.

  • Another way to make it work is to add the chrome_options argument --disable-dev-shm-usage. This forces Chrome to use the /tmp directory instead, which may slow down execution since disk is used instead of memory (a combined sketch follows this list):

    chrome_options.add_argument('--disable-dev-shm-usage')
    

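For reference, here is a minimal Python sketch that combines these flags. The headless flag and the assumption that chromedriver is on the PATH come from the question's setup, not from this answer:

from selenium import webdriver
from selenium.webdriver.chrome.options import Options

chrome_options = Options()
chrome_options.add_argument('--headless')                # the question runs on a headless server
chrome_options.add_argument('--no-sandbox')              # avoid sandbox-related crashes when running as root
chrome_options.add_argument('--disable-dev-shm-usage')   # write shared memory to /tmp instead of /dev/shm

# chromedriver is assumed to be on the PATH
driver = webdriver.Chrome(options=chrome_options)
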
from tab crashed

from tab crashed has been a work in progress (WIP) with the Chromium team for quite some time; it relates to Linux attempting to always use /dev/shm for non-executable memory.



I was getting the following error on my Ubuntu server:

selenium.common.exceptions.WebDriverException: Message: unknown error: session deleted because of page crash from tab crashed (Session info: headless chrome=86.0.4240.111) (Driver info: chromedriver=2.41.578700 (2f1ed5f9343c13f73144538f15c00b370eda6706),platform=Linux 5.4.0-1029-aws x86_64)

It turned out that the cause of the error was insufficient disk space on the server, and the solution was to extend my disk space. You can check this question for more information.
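If you suspect the same cause, a quick sanity check from Python (my own illustration, not part of the original answer; adjust the paths to your setup) could be:

import shutil

# Report free space on the root filesystem and on /dev/shm; exhausting either
# can lead to "session deleted because of page crash" in headless Chrome.
for path in ('/', '/dev/shm'):
    total, used, free = shutil.disk_usage(path)
    print(f"{path}: {free / 1024**2:.0f} MiB free of {total / 1024**2:.0f} MiB")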

In case someone is facing this problem with docker containers:

use the flag --shm-size=2g when creating the container and the error is gone. This flag makes the container use the host's shared memory.

Example

$ docker run -d --net gridNet2020 --shm-size="2g" -e SE_OPTS="-browser applicationName=zChromeNodePdf30,browserName=chrome,maxInstances=1,version=78.0_debug_pdf" -e HUB_HOST=selenium-hub-3.141.59 -P -p 5700:5555 --name zChromeNodePdf30 -v /var/lib/docker/sharedFolder:/home/seluser/Downloads selenium/node-chrome:3.141.59-xenon

Source: https://github.com/SeleniumHQ/docker-selenium

We need to specify the shm memory separately with --shm-size=2g. In the case of Docker Compose, use the following config; this is working fine for me:


services:
  chrome:
    image: selenium/node-chrome:4.0.0-rc-1-prerelease-20210823
    shm_size: 2gb

This happened to me while trying to open a new web page with the same driver in Chromium. It worked fine on my local machine, where I use Chrome.

Did not work:

driver = webdriver.Chrome(options=options)
driver.execute_script("Object.defineProperty(navigator, 'webdriver', {get: () => undefined})")
driver.execute_cdp_cmd('Network.setUserAgentOverride', {
    "userAgent": 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.53 Safari/537.36'})


driver.get('url1')
# Do operations with url1


driver.get('url2')
# Do operations with url2 -> did not work and crashed

Below is the solution I am using, which is working for me, i.e. re-initializing the driver:

def setup_driver():
    global driver
    driver = webdriver.Chrome(options=options)
    driver.maximize_window()
    driver.execute_script("Object.defineProperty(navigator, 'webdriver', {get: () => undefined})")
    driver.execute_cdp_cmd('Network.setUserAgentOverride', {
        "userAgent": 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.53 Safari/537.36'})




setup_driver()
driver.get('url1')
# Do operations with url1
driver.close()


setup_driver()
driver.get('url2')
# Do operations with url2
driver.close()

I'm not sure whether this is the only possible cause and solution, but after thoroughly investigating this error, which I encountered every now and then, I found the following evidence:

  1. In the log of the Selenium Grid nodes (which you can show by executing the following command on the Docker host: sudo docker logs <container-id>) I found many errors reading: [SEVERE]: bind() failed: Cannot assign requested address (99). From what I read, this error usually means that there are no available ports.
  2. When showing the processes running inside a node (sudo docker exec -it <container-id> bash and then ps aux), I found more than 300 instances of chrome-driver processes (you can count them using ps aux|grep driver|wc -l).

When running locally, I know that the chrome-driver process is normally started when you create an instance of ChromeDriver and terminated when you call driver.Quit() (I work in C#, not Python). Therefore I concluded that some tests don't call driver.Quit().

The conclusion

In my case, I found that even though we had a call to driver.Quit() in the [TearDown] method (we use NUnit), we had some more code before that line that could throw an exception. When one of these preceding lines threw an exception, the line that calls driver.Quit() was not reached, and therefore over time we were "leaking" chrome-driver processes on the Selenium Grid nodes. These orphan processes caused a leak of available ports (and probably also memory), which in turn caused the browser's page to crash.

The solution

Given the above conclusion, the solution was pretty straightforward. We had to wrap the code that precedes driver.Quit() in a try/finally and put the call to driver.Quit() in the finally clause, like this:

[TearDown]
public void MyTearDown()
{
    try
    {
        // Perform any tear down code you like, like saving screenshots, page source, etc.
    }
    finally
    {
        _driver?.Quit();
    }
}

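The same idea carries over to Python: put driver.quit() in a finally clause. This is an illustrative sketch of the pattern described above, not code from the original test suite; the screenshot call simply stands in for whatever teardown work you do:

def teardown_driver(driver):
    try:
        # Teardown work that might itself fail, e.g. saving a screenshot or page source.
        driver.save_screenshot('teardown.png')
    finally:
        # Always runs, even if the code above raises, so no chromedriver
        # process is leaked on the node.
        if driver is not None:
            driver.quit()
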
Message: unknown error: session deleted because of page crash from unknown error: cannot determine loading status from tab crashed
(Session info: headless chrome=95.0.4638.69)

This error occurred because there was not enough waiting time for web pages to load.

I was having the same problem. I checked the log to see at which point in my script the bug happened, added some wait, i.e. time.sleep(2), just before that point, and my problem was fixed.
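A fixed time.sleep() works, but an explicit wait is usually more robust. Here is a minimal sketch; the ten-second timeout and the locator are placeholders I chose for illustration:

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Block for up to 10 seconds until the element is present, instead of sleeping blindly.
WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.CSS_SELECTOR, 'body'))
)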

The answers above solved my issue, but since I needed to run it from a docker-compose.yml, I used this configuration, which calls my regular, unchanged Dockerfile.

docker-compose.yml

version: '1.0'
services:
  my_app:
    build:
      context: .
      # when building
      shm_size: 1gb
    # when running
    shm_size: 1gb

Dockerfile (Selenium on Ubuntu -WSL-)

FROM python:3.10


# install google chrome
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add -
RUN sh -c 'echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google-chrome.list'
RUN apt-get -y update
RUN apt-get install -y google-chrome-stable


# install chromedriver
RUN apt-get install -yqq unzip
RUN wget -O /tmp/chromedriver.zip http://chromedriver.storage.googleapis.com/`curl -sS chromedriver.storage.googleapis.com/LATEST_RELEASE`/chromedriver_linux64.zip
RUN unzip /tmp/chromedriver.zip chromedriver -d /usr/local/bin/


# set display port to avoid crash
ENV DISPLAY=:99


# install selenium
RUN pip install selenium==3.8.0




# install and prepare app


COPY ./requirements.txt ./
# COPY . /app
RUN pip3 install -r requirements.txt
RUN apt-get install -y libnss3


ENV APP_DIR=/app/my_app
RUN mkdir -p ${APP_DIR}
WORKDIR ${APP_DIR}


# COPY . ${APP_DIR} #not needed since we are mapping the volume in docker-compose


CMD [ "my_app.py" ]
ENTRYPOINT [ "python" ]