Argo CD is a declarative, GitOps continuous delivery tool for Kubernetes (OpenShift). Argo CD follows the GitOps pattern of using Git repositories as the source of truth for defining the desired application state.

To deploy Argo CD, you can follow the Argo CD Operator installation guide here. To make things easier for you (and me), I composed a nifty YAML that can deploy Argo CD on an OpenShift 4 cluster (tested on v4.7). GitHub repo

git clone https://github.com/ksingh7/argocd-openshift.git
cd argocd-openshift
# Login as kubeadmin
# Deploy the ArgoCD Operator
oc create -f 1_argocd_operator.yaml

This will create a namespace argocd and create an…
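For reference, here is a plausible sketch of what 1_argocd_operator.yaml could contain: a Namespace plus the OperatorGroup and Subscription that OLM needs. This is my reconstruction, not the repo's exact file; the channel and catalog source values are assumptions.

apiVersion: v1
kind: Namespace
metadata:
  name: argocd
---
apiVersion: operators.coreos.com/v1
kind: OperatorGroup
metadata:
  name: argocd
  namespace: argocd
spec:
  targetNamespaces:
  - argocd
---
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: argocd-operator
  namespace: argocd
spec:
  channel: alpha                   # channel name is an assumption
  name: argocd-operator
  source: community-operators      # catalog source is an assumption
  sourceNamespace: openshift-marketplace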


These are my quick-and-dirty brain-dump notes to myself on how to back up a Prometheus database running on Kubernetes or OpenShift.

  • Get a token for API authentication and the Prometheus API route URL
oc whoami -t
oc get route -n openshift-monitoring | grep -i prometheus
  • Run a sample curl request

curl -ks -H 'Authorization: Bearer 0za4LjX9xPcqDjhWaufkgcQGo4grqA7ws4zvHrqgfY4' 'https://prometheus-k8s-openshift-monitoring.apps.ocp4.cp4d.com/api/v1/query?query=ALERTS' | python -m json.tool
  • Create TSDB Snapshot
curl -X POST -ks -H 'Authorization: Bearer 0za4LjX9xPcqDjhWaufkgcQGo4grqA7ws4zvHrqgfY4' 'https://prometheus-k8s-openshift-monitoring.apps.ocp4.cp4d.com/api/v2/admin/tsdb/snapshot' | python -m json.tool
  • You might get the error below, in which case you first need to enable the Admin APIs
{
  "error": "Admin APIs are disabled",
  "message": "Admin APIs are disabled",
  "code": 14
}
  • Enable…
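On OpenShift, one way to flip that switch is sketched below; the Prometheus CRD's spec.enableAdminAPI field is real, but whether the cluster-monitoring operator lets a manual patch stick on your version is an assumption.

# Enable the TSDB admin API on the cluster monitoring Prometheus
oc patch prometheus k8s -n openshift-monitoring --type merge --patch '{"spec":{"enableAdminAPI":true}}'

Once the Prometheus pods restart, re-run the snapshot call above; the snapshot is written under the TSDB data directory in snapshots/<snapshot-name>.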

4-part blog series: Documenting the learnings of my experimentation on Object Detection using Keras and TensorFlow


In the last blog post, we learned how to convert a Keras model file into the TensorFlow (2.x) frozen graph model format (*.pb) and its text representation (*.pbtxt).

In this blog post, we will learn how to convert the same Keras model file into the TensorFlow SavedModel format (*.pb).

Introduction

TF SavedModel format: a single SavedModel may represent multiple graph definitions as MetaGraphDef protocol buffers. Weights and other variables usually aren't stored inside the file during training. Instead, they're held in separate checkpoint files.

Implementation

If you'd like to follow along, refer to the 4_convert_keras_model_to_tensorflow_savedmodel_format.ipynb Jupyter notebook in my repository. Below…
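The gist of the conversion is short; here is a minimal sketch, assuming the single-file Keras model from part-II (the notebook remains the authoritative version, and the file names are placeholders):

import tensorflow as tf

# Load the single-file Keras model produced in part-II
model = tf.keras.models.load_model("wpod-net.h5")

# Export to the SavedModel directory layout:
# saved_model.pb plus a variables/ checkpoint directory
tf.saved_model.save(model, "wpod_net_savedmodel")

# The exported model can be reloaded later without the original Python model code
reloaded = tf.saved_model.load("wpod_net_savedmodel")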


4-part blog series: Documenting the learnings of my experimentation on Object Detection using Keras and TensorFlow

In part-I we learned the process of using a pre-trained Keras model with separate model weight and model configuration files. As a follow-on, in part-II we discussed saving the model weights and configuration into a single HDF5-format Keras model file.

In this blog post, we will learn how to convert a Keras model file into the TensorFlow (2.x) frozen graph model format (*.pb) and its text representation (*.pbtxt).

Introduction

TF SavedModel format: saved_model.pb may represent multiple graph definitions as MetaGraphDef protocol buffers. Weights and other variables usually aren't stored inside the file during…
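As a hedged sketch of the freezing approach this post describes (the notebook in my repo is the authoritative version; file names are placeholders):

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

# Load the single-file Keras model from part-II
model = tf.keras.models.load_model("wpod-net.h5")

# Wrap the model in a concrete tf.function, then fold its variables into constants
full_model = tf.function(lambda x: model(x))
concrete = full_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))
frozen = convert_variables_to_constants_v2(concrete)

# Serialize the frozen GraphDef in binary (*.pb) and text (*.pbtxt) form
tf.io.write_graph(frozen.graph.as_graph_def(), "frozen_model", "model.pb", as_text=False)
tf.io.write_graph(frozen.graph.as_graph_def(), "frozen_model", "model.pbtxt", as_text=True)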


4-part blog series: Documenting the learnings of my experimentation on Object Detection using Keras and TensorFlow

Photo by Tamanna Rumee on Unsplash

If you have read the part-I blog, you may recall that we used the wpod-net pre-trained model with separate model weight (*.h5) and model structure (*.json) files. Both files were loaded into the Python program in order to detect number plates.

In this blog post, we will learn how to convert (save) the Keras model weight and model structure files into a single Keras *.h5 file.

Introduction

A Keras model consists of multiple components (a save/convert sketch follows this list):

  • The architecture, or configuration, which specifies what layers the model contains and how they're connected, typically in JSON or YAML format.
  • A set of weights values (the “state of…
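A minimal sketch of the combine-and-save step (file names are placeholders; if wpod-net uses custom layers, model_from_json would also need a custom_objects argument):

from tensorflow.keras.models import model_from_json

# Rebuild the architecture from the JSON file, then attach the HDF5 weights
with open("wpod-net.json") as f:
    model = model_from_json(f.read())
model.load_weights("wpod-net.h5")

# Persist architecture + weights together as a single HDF5 file
model.save("wpod-net-combined.h5")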

4-part blog series: Documenting the learnings of my experimentation on Object Detection using Keras and TensorFlow

License Plate Detection using a pre-trained Keras Model

Recently I was working with my friends at RHT on an interesting use case of Deep Learning. This blog series is a collection of "how to do" certain things that I learned during my ML experimentation. Here is the library of all my blogs in this series.

Introduction

I used a pre-trained model called wpod-net for detecting license plates. I…
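As a hedged sketch of that loading-and-detecting step (file names and the input size are placeholders, and wpod-net's real pre- and post-processing is more involved than this):

import numpy as np
from tensorflow.keras.models import model_from_json

# Rebuild the architecture from JSON, then attach the weights
with open("wpod-net.json") as f:
    model = model_from_json(f.read())
model.load_weights("wpod-net.h5")

# A dummy normalized RGB image standing in for a real car photo
img = np.random.rand(1, 288, 416, 3).astype("float32")  # input size is an assumption
pred = model.predict(img)  # raw feature map; decoding it into plate boxes comes next
print(np.shape(pred))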


You know tech tools are cool, but unless you have a defined use case, it's kind of hard to put things into perspective and understand how different tools can interact with each other, help solve a problem, or explore new use cases.

So, to educate and motivate our technical buyers, sellers, and customers, I created a fancy use case of ingesting live Twitter tweets and applying sentiment analysis to them. For this demo I used the following tools (a sketch of the ingest path follows the list):

  • Twitter API: real-time streaming data source
  • Red Hat AMQ Streams: Apache Kafka cluster to store the real-time streaming data coming into the system
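To make the pipeline concrete, here is a minimal sketch of the ingest side, assuming the kafka-python client; the bootstrap address, topic name, and sample tweets are placeholders standing in for the live Twitter API stream:

import json
from kafka import KafkaProducer  # kafka-python client (any Kafka client would do)

# Placeholder tweets standing in for the live Twitter stream
sample_tweets = [{"text": "OpenShift is great"}, {"text": "Kafka rocks"}]

producer = KafkaProducer(
    bootstrap_servers="my-cluster-kafka-bootstrap:9092",  # AMQ Streams bootstrap service (placeholder)
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
for tweet in sample_tweets:
    producer.send("tweets", tweet)  # topic name is an assumption
producer.flush()

A consumer on the other side would read from the same topic and score each tweet with a sentiment model.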


Photo by Andre A. Xavier

Having been an active Kubernetes user for the last 3 years, I recently learned an old concept in a new, practical way. It's very likely that, as a k8s user, you have never paid attention to (or never knew) what an Endpoint object is; under the covers, however, you have been using it all along, full guarantee :)

One-liner explanations of two key Kubernetes concepts

What is a Service in k8s
A service is a k8s object that exposes an application running in one or many pods as a “network service”

What is an Endpoint in k8s
An Endpoint is a k8s object that…
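To make the pair concrete, here is a minimal sketch (names are placeholders): a Service that selects pods by label, for which Kubernetes automatically creates and maintains a matching Endpoints object listing the selected pod IPs.

apiVersion: v1
kind: Service
metadata:
  name: my-app              # placeholder name
spec:
  selector:
    app: my-app             # pods carrying this label back the Service
  ports:
  - port: 80                # port the Service exposes
    targetPort: 8080        # container port on the selected pods

Running oc get endpoints my-app then shows the pod IP:port pairs that the Service load-balances across.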


Learn how to move Kafka messages to Ceph S3 Object Storage using Secor

If your business is your Solar System, then your Data is the SUN, it has both gravity & mass, everything revolves around it, it must live forever — Myself

Introduction

Kafka is one of the most popular messaging systems out there, used for real-time streams of data: to collect big data, to do real-time analysis, or both. Kafka is used to stream data into data lakes, applications, and real-time stream-analytics systems.


The Red Hat OpenShift installer by default uses self-signed certificates to encrypt communication with the web console as well as with applications exposed via OpenShift Routes. Self-signed certs generally suffice for dev/test environments; for production environments, however, it's highly recommended to use proper certificates to secure all your OpenShift routes.

In this post, you will learn how to request TLS certificates from Let’s Encrypt and apply those to your OpenShift 4 cluster as a post-installation step.

Prerequisites

  • An up-and-running OpenShift 4 cluster
  • A registered domain name with access to DNS management (see supported DNS providers here)

Get .. Set .. Go !!

  • Clone the acmesh-official repository
cd $HOME
git clone https://github.com/acmesh-official/acme.sh.git
cd…
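From here, the remaining flow is roughly as follows (a hedged sketch: the domain, DNS provider hook, and certificate file paths are placeholders; the secret and IngressController steps follow the documented OpenShift post-install procedure):

# Issue a wildcard certificate via DNS-01 validation
# (dns_cf targets Cloudflare; export your DNS provider's API credentials first)
./acme.sh --issue --dns dns_cf -d '*.apps.ocp4.example.com'

# Store the issued certificate as a TLS secret for the router
# (cert/key paths are placeholders; acme.sh prints the real locations)
oc create secret tls router-certs --cert=fullchain.cer --key=private.key -n openshift-ingress

# Point the default IngressController at the new certificate
oc patch ingresscontroller default -n openshift-ingress-operator \
  --type=merge --patch '{"spec":{"defaultCertificate":{"name":"router-certs"}}}'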

Karan Singh

Sr. Solution Architect @ Red Hat ♦ Loves Kubernetes, Storage, Serverless, Hybrid-Multi-Cloud, Software Architectures, DevOps, Data Analytics & AI/ML
