Writing data to Elasticsearch 8 over NGINX reverse proxy fails when code running on Ubuntu

  • Thread starter: Aj_ruba (Guest)
I have an Elasticsearch 8.12 cluster on an internal network. It uses self-signed certificates and everything runs well: I push data into it all the time using Python, with no errors and no warnings.

I also have an Nginx reverse proxy with a proper SSL certificate which I use to provide access to Kibana from outside our network.

I recently needed to ingest data into Elasticsearch from outside our network, so I thought of doing it over the reverse proxy.

I added a proxy pass to my Elasticsearch in the reverse proxy configuration file:

Code:
    location /myelasticsearch {
        proxy_pass https://10.10.10.20:9200/;
    }

I used the same Python 3.10 code I use internally but merely replaced the URL in the code from https://10.10.10.20:9200 to https://example.com/myelasticsearch.

If I run the code on Windows 11, all works well: no errors and no warnings. If I run the same code on Ubuntu 22.04 with Python 3.10, I get the following error: elastic_transport.TlsError: TLS error caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)).
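Since the error appears only on Ubuntu, one plausible difference is the CA trust store each platform uses: Python on Windows reads the Windows certificate store (and Windows can fetch missing intermediate certificates on its own), while OpenSSL on Ubuntu only consults the local bundle and never fetches intermediates. A quick sketch to see where the Ubuntu interpreter looks for trusted CAs:

```python
import ssl

# Show where this interpreter's OpenSSL looks for trusted CAs.
# On Ubuntu this is typically /usr/lib/ssl/cert.pem or /etc/ssl/certs;
# on Windows, Python loads the system certificate store instead.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)
print("capath:", paths.capath)
```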

I have checked and checked the code and the certificate. I tried writing to a different server in the cluster (after changing the proxy pass, of course). The results are the same: the code runs with no errors on Windows and gives the same error on Ubuntu.
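One additional way to check the certificate file itself (the helper and path below are placeholders, not from the original code): count how many PEM certificates the ca_certs bundle actually contains. "unable to get local issuer certificate" often means an intermediate certificate is missing from the chain, which Windows can paper over by fetching it but OpenSSL on Ubuntu cannot:

```python
def count_certs(path: str) -> int:
    """Count PEM certificates in a bundle file."""
    with open(path) as f:
        return f.read().count("-----BEGIN CERTIFICATE-----")

# Example (path is hypothetical): if count_certs("ca.pem") returns 1 for a
# chain that needs an intermediate, the bundle is likely incomplete.
```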

I have tried the same code on WSL (Ubuntu 22.04) and keep getting the same error. I did lots of searching on the Internet and tried different things; I downgraded urllib3 to 1.24, and although Elasticsearch complained, I still got the same error.

I changed the Elasticsearch connection code from:

Code:
    from elasticsearch import Elasticsearch

    elasticsearch = Elasticsearch([server_id], basic_auth=(user_name, elastic_pass), verify_certs=True, ca_certs=certificate_fullpath)

to:

Code:
    from ssl import create_default_context

    context = create_default_context(cafile=certificate_fullpath)
    context.check_hostname = False
    context.hostname_checks_common_name = False

    elasticsearch = Elasticsearch([server_id], ssl_context=context, basic_auth=(user_name, elastic_pass))

Same issue, it works when the code is running on Windows and fails when the code is running on Ubuntu.
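For what it's worth, that outcome makes sense: turning off hostname checks does not turn off certificate chain verification, so the "unable to get local issuer certificate" failure is untouched by the second variant. A minimal sketch:

```python
from ssl import CERT_REQUIRED, create_default_context

context = create_default_context()
context.check_hostname = False
context.hostname_checks_common_name = False

# Hostname checks are now off, but the certificate chain is still verified:
print(context.verify_mode == CERT_REQUIRED)  # True
```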

I added various things to the Nginx configuration, but to no avail.

If I change verify_certs=True to verify_certs=False in the Elasticsearch connection, data gets pushed into Elasticsearch when the code runs on Ubuntu, but I get a warning that the connection is insecure.

Has anyone come across this, and if so, were you able to solve it?

Thanks.