Configuring TLS/SSL Encryption for CDH Services
This section describes how to configure TLS/SSL encryption for CDH services (HDFS, MapReduce, YARN, HBase, Hive, Impala, Hue, and Oozie).
- Configuring TLS/SSL for HDFS, YARN and MapReduce
- Configuring TLS/SSL for HBase
- Configuring TLS/SSL for Flume Thrift Source and Sink
- Configuring Encrypted Communication Between HiveServer2 and Client Drivers
- Configuring TLS/SSL for Hue
- Configuring TLS/SSL for Impala
- Configuring TLS/SSL for Oozie
- Configuring TLS/SSL for Solr
- Spark Encryption
- Configuring TLS/SSL for HttpFS
- Encrypted Shuffle and Encrypted Web UIs
- Cloudera recommends securing a cluster with Kerberos authentication before enabling TLS/SSL encryption. If you enable TLS/SSL on a cluster that does not already have Kerberos authentication configured, a warning is displayed.
- The following sections assume that you have created all the certificates required for TLS/SSL communication. If you have not, see Creating Certificates for instructions.
- The certificates and keys to be deployed in your cluster should be organized into the appropriate set of keystores and truststores. For more information, see Creating Java Keystores and Truststores.
Hadoop Services as TLS/SSL Servers and Clients
- HDFS, MapReduce, and YARN daemons act as both TLS/SSL servers and clients.
- HBase daemons act as TLS/SSL servers only.
- Oozie daemons act as TLS/SSL servers only.
- Hue acts as a TLS/SSL client to all of the above.
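For example, the HDFS, MapReduce, and YARN daemons read their server-side TLS/SSL settings from ssl-server.xml (and the corresponding client-side settings from ssl-client.xml). A minimal sketch, in which the keystore and truststore paths and the passwords are placeholders:

```xml
<!-- ssl-server.xml: server-side TLS/SSL settings read by the HDFS,
     MapReduce, and YARN daemons (values below are placeholders). -->
<configuration>
  <property>
    <name>ssl.server.keystore.location</name>
    <value>/etc/security/jks/node1-keystore.jks</value>
  </property>
  <property>
    <name>ssl.server.keystore.password</name>
    <value>changeit</value>
  </property>
  <property>
    <name>ssl.server.truststore.location</name>
    <value>/etc/security/jks/truststore.jks</value>
  </property>
  <property>
    <name>ssl.server.truststore.password</name>
    <value>changeit</value>
  </property>
</configuration>
```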
Compatible Certificate Formats for Hadoop Components
|Component|Compatible Certificate Format|
|---|---|
|Hive (for communication between Hive clients and HiveServer2)|Java Keystore|