Viewing OpenTelemetry Metrics and Trace Data in Observability by Aria Operations for Applications

October 12, 2022

This post was written by Travis Keep and Sri Harsha Yayi.

Modern application architectures are complex, typically consisting of hundreds of distributed microservices implemented in different languages and by different teams. As a developer, site-reliability engineer, or DevOps professional, you are responsible for the reliability and performance of these complex systems. With observability, you can ask questions about your system and get answers based on the telemetry data it produces. Metrics data, along with Aria Operations for Applications alerts, can notify you if something needs attention. Distributed tracing can help you pinpoint the root causes for failures and identify performance bottlenecks by analyzing every request moving across services. So, how do we go about instrumenting our services to emit metrics and traces?

Aria Operations for Applications supports various instrumentation and ingestion methods for metrics and traces. That now includes support for OpenTelemetry, an industry standard that merges the OpenTracing and OpenCensus projects to offer a complete telemetry system for monitoring distributed systems. OpenTelemetry provides a set of APIs, SDKs, and integrations for collecting and exporting the metrics, traces, and logs generated by your distributed microservices so they can be monitored and analyzed with platforms such as Aria Operations for Applications. You can configure an application to send both metrics and tracing data to Aria Operations for Applications using the Wavefront Proxy.
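
To make the moving parts concrete, here is a rough sketch of what exporting spans to the proxy looks like if you wire the OpenTelemetry SDK up by hand. It assumes the opentelemetry-sdk and opentelemetry-exporter-otlp dependencies are on the classpath and that a Wavefront Proxy is listening on localhost:4317 (set up in Step 1 below); the class and span names are hypothetical, and the Java agent used later in this post does the equivalent wiring for you automatically.

import io.opentelemetry.api.OpenTelemetry;
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.Tracer;
import io.opentelemetry.exporter.otlp.trace.OtlpGrpcSpanExporter;
import io.opentelemetry.sdk.OpenTelemetrySdk;
import io.opentelemetry.sdk.resources.Resource;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;

public class ManualTracingSketch {
    public static void main(String[] args) {
        // Identify the service; the agent derives this from -Dotel.service.name instead.
        Resource resource = Resource.getDefault().merge(Resource.create(Attributes.of(
                AttributeKey.stringKey("service.name"), "petclinic",
                AttributeKey.stringKey("application"), "demo-petclinic")));

        // Export spans over OTLP gRPC to the Wavefront Proxy from Step 1.
        OtlpGrpcSpanExporter exporter = OtlpGrpcSpanExporter.builder()
                .setEndpoint("http://localhost:4317")
                .build();

        SdkTracerProvider tracerProvider = SdkTracerProvider.builder()
                .addSpanProcessor(BatchSpanProcessor.builder(exporter).build())
                .setResource(resource)
                .build();

        OpenTelemetry openTelemetry = OpenTelemetrySdk.builder()
                .setTracerProvider(tracerProvider)
                .build();

        // One hand-rolled span; the agent generates spans like this automatically
        // for HTTP requests, JDBC calls, and other instrumented libraries.
        Tracer tracer = openTelemetry.getTracer("manual-example");
        Span span = tracer.spanBuilder("doWork").startSpan();
        try {
            // ... application logic ...
        } finally {
            span.end();
        }

        tracerProvider.close();
    }
}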

Not familiar with OpenTelemetry? Don’t worry! With OpenTelemetry’s auto-instrumentation agents, manual instrumentation is not required. Let’s explore how you can get started with OpenTelemetry and Aria Operations for Applications without manually instrumenting your Java application. In fact, we can get you there in three simple steps.

Before we get started, there are a few prerequisites:

  • You’ll need an Aria Operations for Applications account to visualize and monitor your application health. If you don’t have one already, you can sign up here.

  • Docker

  • Java 11 or higher

  • Maven

Step 1: Install Wavefront Proxy

Configure your Aria Operations for Applications URL and the token. (If you’ve signed up for the free trial, here’s how you can get your token.)

1. Make sure you have the Proxies permission. You can check by going to https://{cluster}.wavefront.com/userprofile/groups. A checkbox next to “Proxies” indicates you have the permission; if you don’t see one, your administrator needs to add this permission to your user account.

2. If you signed up for the free trial, follow these steps to get an API token.

3. Run the following command to install the proxy:

docker run -d \
   -e WAVEFRONT_URL=https://{CLUSTER}.wavefront.com/api/ \
   -e WAVEFRONT_TOKEN={TOKEN} \
   -e JAVA_HEAP_USAGE=512m \
   -e WAVEFRONT_PROXY_ARGS="--otlpGrpcListenerPorts 4317" \
   -p 2878:2878 \
   -p 4317:4317 \
   wavefronthq/proxy:11.4

Here, replace {CLUSTER} with the name of your Wavefront cluster and {TOKEN} with the API token that you generated. Note that you need at least version 11.4 of Wavefront Proxy to correctly ingest OpenTelemetry data.

4. Confirm that the proxy is running.
docker ps

If docker ps does not list the Wavefront Proxy container, the proxy has stopped running. If this happens, run docker logs <container ID> to view the logs and find the issue; the docker run command from step 3 prints the container ID.

Step 2: Run the auto-instrumented application

For instrumentation, you use the Java agent provided by OpenTelemetry, which can be attached to any Java application. The agent dynamically injects bytecode to collect telemetry data, so developers can avoid manual instrumentation. (If you later want to add custom spans on top of the automatic ones, see the sketch after these steps.)

1. Clone the Spring Petclinic application and navigate to the directory.

git clone https://github.com/spring-projects/spring-petclinic.git

cd spring-petclinic

2. Run ./mvnw package from the root directory of the project.

3. Download the OpenTelemetry Java agent.

4. Assign the file path to the JAVA_AGENT variable.
JAVA_AGENT=<path to OpenTelemetry Java agent>

5. Attach the Java agent and start the Spring Petclinic application.

java -javaagent:$JAVA_AGENT \
   -Dotel.service.name=petclinic \
   -Dotel.resource.attributes=application=demo-petclinic \
   -Dotel.exporter.otlp.metrics.temporality.preference=DELTA \
   -Dotel.exporter.otlp.metrics.default.histogram.aggregation=EXPONENTIAL_BUCKET_HISTOGRAM \
   -jar target/*.jar

6. Navigate to http://localhost:8080 and interact with the Petclinic application to generate telemetry data.
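
The agent’s automatic spans cover things like HTTP endpoints, database calls, and framework internals. If you also want to trace a specific business method, one option is OpenTelemetry’s span annotations, which the agent honors at runtime. The sketch below is illustrative only: the class, method, and span names are hypothetical, and it assumes the opentelemetry-instrumentation-annotations dependency (older agent versions ship the annotation as io.opentelemetry.extension.annotations.WithSpan instead).

import io.opentelemetry.instrumentation.annotations.SpanAttribute;
import io.opentelemetry.instrumentation.annotations.WithSpan;

// Hypothetical service class: the agent wraps any method annotated with
// @WithSpan in a span, with no SDK wiring needed in application code.
public class VisitScheduler {

    @WithSpan("schedule-visit") // span name shown in the trace view
    public void scheduleVisit(@SpanAttribute("petId") long petId) {
        // ... business logic; the agent starts the span before this method
        // runs and ends it when the method returns or throws ...
    }
}

Rebuild and restart the application with the agent attached, and the new spans should appear under the petclinic service alongside the automatically generated ones.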

Step 3: View the metrics and distributed traces

Traces

You can view distributed tracing data in the following way:

1. In the menu bar at the top of Wavefront, go to Applications -> Traces.

2. You will see a screen similar to the one below:

If you go to Applications -> Application Status, you will see something similar to the screenshot below:

If you click on Petclinic and choose View Service Dashboard from the popup menu, you will see something similar to the screenshot below. The Service Dashboard provides a health overview at the service level.


Metrics

When the metric data collected by the Wavefront Proxy is sent to Aria Operations for Applications, you can examine it in the Aria Operations for Applications user interface.

For example, the query ts(jvm.threads.live) shows the total number of live threads in the Petclinic application.

This query shows the 85th percentile of HTTP response times over the last 15 minutes: cumulativePercentile(85, mavg(15m, deriv(sum(ts(http.server.duration), le)))). In this case, 85 percent of the HTTP response times are 98.75ms or less.

Note: This query might take up to 15 minutes to complete.
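
The JVM and HTTP metrics above come from the agent automatically. If you also want an application-specific metric, you can use the OpenTelemetry metrics API; when the agent is attached, it backs GlobalOpenTelemetry, so instruments created through the API should flow through the same OTLP pipeline to the proxy. A minimal sketch, assuming the opentelemetry-api dependency and using hypothetical class and metric names:

import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.api.metrics.LongCounter;

public class VisitMetrics {

    // Counter exported by the agent alongside the built-in JVM metrics.
    private static final LongCounter VISITS_BOOKED = GlobalOpenTelemetry
            .getMeter("petclinic-custom")
            .counterBuilder("petclinic.visits.booked")
            .setDescription("Number of visits booked")
            .build();

    public static void recordBooking(String clinic) {
        VISITS_BOOKED.add(1, Attributes.of(AttributeKey.stringKey("clinic"), clinic));
    }
}

Once the application reports data, you could then chart it with a query such as ts(petclinic.visits.booked).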
 

The telemetry data capabilities of VMware Aria Operations for Applications give you a single, concise view into distributed systems, but that view is achievable only for instrumented applications. The good news is that OpenTelemetry provides a standard set of APIs and libraries for instrumenting our systems, and it provides the foundation for auto-instrumentation of applications across languages and frameworks.

There’s much more to the Aria Operations for Applications platform. Learn more or ask questions by joining the Aria Operations for Applications community.

 
