Argo CD is a declarative, GitOps continuous delivery tool for Kubernetes (and OpenShift). It follows the GitOps pattern of using Git repositories as the source of truth for defining the desired application state.
To deploy Argo CD, you can follow the Argo CD Operator installation guide here. To make things easier for you (and me), I composed a nifty YAML that can deploy Argo CD on an OpenShift 4 (tested on v4.7) cluster. Github Repo
git clone https://github.com/ksingh7/argocd-openshift.git
cd argocd-openshift
# Login as kubeadmin
# Deploy the ArgoCD Operator
oc create -f 1_argocd_operator.yaml
This will create a namespace argocd and create an…
If you need automatic SSL/TLS certificates for free for all your internet-facing applications running on OpenShift and Kubernetes, you gotta read this.
Automatic Certificate Management Environment (ACME) protocol is a communications protocol for automating interactions between certificate authorities and their users’ web servers, allowing the automated deployment of public key infrastructure at very low cost (Source)
Tomáš Nožička developed a fantastic ACME controller for OpenShift and Kubernetes that automatically provisions certificates from the Let's Encrypt CA using the ACME v2 protocol and manages their lifecycle, including automatic renewals. The link to the original Github project is here.
It's a simple two…
These are my quick-and-dirty brain-dump notes to myself on how to back up the Prometheus database running on k8s or OpenShift.
# Grab an API token for the current user
oc whoami -t
# Find the Prometheus route
oc get route -n openshift-monitoring | grep -i prometheus
curl -ks -H 'Authorization: Bearer 0za4LjX9xPcqDjhWaufkgcQGo4grqA7ws4zvHrqgfY4' 'https://prometheus-k8s-openshift-monitoring.apps.ocp4.cp4d.com/api/v1/query?query=ALERTS' | python -m json.tool
curl -X POST -ks -H 'Authorization: Bearer 0za4LjX9xPcqDjhWaufkgcQGo4grqA7ws4zvHrqgfY4' 'https://prometheus-k8s-openshift-monitoring.apps.ocp4.cp4d.com/api/v2/admin/tsdb/snapshot' | python -m json.tool
The snapshot call fails because the admin APIs are disabled by default on the cluster-managed Prometheus:
"error": "Admin APIs are disabled",
"message": "Admin APIs are disabled",
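If you want to script these checks, here is a minimal Python sketch of the same ALERTS query, assuming the requests library is available; the route is the one from above, and the token is a placeholder for the oc whoami -t output:
import requests

# Placeholders: route from `oc get route`, token from `oc whoami -t`
PROM_URL = "https://prometheus-k8s-openshift-monitoring.apps.ocp4.cp4d.com"
TOKEN = "<oc whoami -t output>"

resp = requests.get(
    f"{PROM_URL}/api/v1/query",
    params={"query": "ALERTS"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    verify=False,  # the lab route uses a self-signed certificate (same as curl -k above)
)
print(resp.json())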
In the last blog post, we learned how to convert a Keras model file into the TensorFlow (2.x) frozen graph model format (*.pb) and its text representation.
In this blog post, we will learn to convert the same Keras model file into the TensorFlow SavedModel format.
TF SavedModel format: a single SavedModel may represent multiple graph definitions as MetaGraphDef protocol buffers. Weights and other variables usually aren't stored inside the file during training. Instead, they're held in separate checkpoint files.
If you'd like to follow along, refer to the 4_convert_keras_model_to_tensorflow_savedmodel_format.ipynb Jupyter notebook from my repository. Below…
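For reference, here is a minimal sketch of the conversion, assuming TensorFlow 2.x and a single-file Keras model; the wpod-net.h5 file name is an assumption carried over from the earlier posts, so adjust it to your own:
import tensorflow as tf

# Load the single-file Keras model (compile=False skips the optimizer state,
# which a pre-trained file may not carry)
model = tf.keras.models.load_model("wpod-net.h5", compile=False)

# Saving to a directory path (no .h5 suffix) writes the TF SavedModel format:
# saved_model.pb plus a variables/ directory holding the checkpointed weights
model.save("wpod_net_savedmodel/")

# Reload from the SavedModel directory to verify the round trip
restored = tf.keras.models.load_model("wpod_net_savedmodel/")
restored.summary()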
In part-I we learned the process of using a pre-trained Keras model with separate model weight and model configuration files. As a follow-on to that, in part-II we discussed saving the model weight and configuration files into a single HDF5-format Keras model file.
In this blog post, we will learn how to convert a Keras model file into the TensorFlow (2.x) frozen graph model format (*.pb) and its text representation.
TF SavedModel format: saved_model.pb may represent multiple graph definitions as MetaGraphDef protocol buffers. Weights and other variables usually aren't stored inside the file during…
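As a rough sketch, the freezing step can be done in TensorFlow 2.x with the convert_variables_to_constants_v2 helper; the wpod-net.h5 file name is assumed from the earlier posts:
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

# Load the Keras model and wrap it as a concrete tf.function
model = tf.keras.models.load_model("wpod-net.h5", compile=False)
full_model = tf.function(lambda x: model(x))
concrete_func = full_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))

# Inline the variables as constants, yielding a single frozen GraphDef
frozen_func = convert_variables_to_constants_v2(concrete_func)

# Write the binary .pb and its text representation (.pbtxt)
tf.io.write_graph(frozen_func.graph, "frozen_model", "wpod-net.pb", as_text=False)
tf.io.write_graph(frozen_func.graph, "frozen_model", "wpod-net.pbtxt", as_text=True)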
If you have read the part-I blog, you may recall that we used the wpod-net pre-trained model with separate model weight (*.h5) and model structure (*.json) files. Both of these files were loaded into the Python program in order to detect number plates.
In this blog post we will learn how to convert (save) the Keras model weight and model structure files into a single Keras *.h5 file.
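Here is a minimal sketch of that conversion, assuming TensorFlow 2.x Keras; the file names wpod-net.json (structure) and wpod-net.h5 (weights) are assumptions based on part-I, so adjust them to yours:
from tensorflow.keras.models import model_from_json

# Rebuild the architecture from the JSON structure file
with open("wpod-net.json", "r") as f:
    model = model_from_json(f.read())

# Load the trained weights into the rebuilt architecture
model.load_weights("wpod-net.h5")

# Save architecture + weights together as a single HDF5 file
model.save("wpod-net-combined.h5")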
A Keras model consists of multiple components:
Recently I was working with my friends at RHT on an interesting use case of Deep Learning. This blog series is a collection of "how to do" notes on certain things I learned during my ML experimentation. Here is the library of all my blogs in this series.
You know tech tools are cool, but unless you have a defined use case, it's kind of hard to put things into perspective and understand how different tools can interact with each other, help solve a problem, or explore new use cases.
So to educate and motivate our technical buyers, sellers, and customers, I created a fancy use case of ingesting live Twitter tweets and applying sentiment analysis to them. For this demo I used the following tools
Being an active Kubernetes user for the last 3 years, I recently learned an old concept in a new way. It's very likely that, as a k8s user, you have never paid attention to or never knew what an Endpoint object is; under the covers, though, you have been using it, full guarantee :)
A one-liner explanation of two key concepts of Kubernetes:
What is a Service in k8s
A service is a k8s object that exposes an application running in one or many pods as a “network service”
What is an Endpoint in k8s
An Endpoint is a k8s object that…
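To make the relationship concrete, here is a small sketch using the official Kubernetes Python client; the service name my-service and namespace default are placeholders:
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() when running inside a pod
v1 = client.CoreV1Api()

# The Service: a stable virtual IP and port in front of a set of pods
svc = v1.read_namespaced_service(name="my-service", namespace="default")
print("service cluster IP:", svc.spec.cluster_ip)

# The Endpoints object of the same name: the actual pod IPs behind the service,
# kept up to date automatically as pods come and go
eps = v1.read_namespaced_endpoints(name="my-service", namespace="default")
for subset in eps.subsets or []:
    for addr in subset.addresses or []:
        print("backing pod IP:", addr.ip)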
Learn how to move Kafka messages to Ceph S3 Object Storage using Secor
If your business is your Solar System, then your Data is the SUN, it has both gravity & mass, everything revolves around it, it must live forever — Myself
Kafka is one of the most popular messaging systems out there, used to handle real-time streams of data, to collect big data, to do real-time analysis, or some combination of these. Kafka is used to stream data into data lakes, applications, and real-time stream analytics systems.
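To get a feel for the producing side, here is a minimal sketch using the kafka-python client; the broker address and the topic name test_topic are placeholders, and Secor would sit on the consuming side, shipping these messages to Ceph S3:
import json
from kafka import KafkaProducer

# Placeholder broker; point this at your Kafka bootstrap server
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a few JSON messages to a topic that Secor is configured to watch
for i in range(10):
    producer.send("test_topic", {"event_id": i, "payload": "hello"})

producer.flush()  # block until all messages reach the broker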