Middleware
Connect Systemiq with external services using Middleware.
Systemiq Middleware acts as a local data processing hub. It collects, validates, and processes structured data received from local Publisher scripts over a secure, local gRPC connection, then forwards the validated data to Systemiq over a secure, external gRPC connection. This ensures that only validated data reaches your analytics and automation pipelines.
How Middleware Works
- The Publisher: Gathers and structures data locally, then sends it to Middleware via a secure, local gRPC connection.
- The Middleware: Listens for incoming data, processes and validates it, and forwards the refined data to Systemiq through a secure, external gRPC connection.
- Communication Flow:
  - Publisher → Middleware: Local gRPC connection.
  - Middleware → Systemiq: External gRPC connection.
Important: Middleware and Publisher typically run within a secure, isolated network that restricts access to sensitive data resources (e.g., data lakes, data warehouses). This dedicated layer integrates seamlessly with your existing infrastructure without impacting current processes.
Use these commands to pull the latest Middleware image and run it locally:

```bash
docker pull systemiq/middleware:latest
docker run -d -p 50051:50051 --name systemiq-middleware systemiq/middleware
```
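If you want to keep the Middleware gRPC port off the host network, in line with the isolated-network note above, one option is to place Middleware and a containerized Publisher on a dedicated Docker network. The commands below are a minimal sketch; the network name is illustrative and the exact topology depends on your infrastructure.

```bash
# Illustrative only: a user-defined Docker network lets the Publisher reach
# Middleware by container name without publishing the gRPC port on the host.
docker network create systemiq-net
docker run -d --network systemiq-net --name systemiq-middleware systemiq/middleware

# A containerized Publisher on the same network would then use
# OBSERVER_ENDPOINT=systemiq-middleware:50051 instead of localhost:50051.
```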
Note: Middleware is not a plug-and-play solution. Systemiq will provide the necessary configuration and access credentials.
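How the configuration and credentials are supplied is defined by the material Systemiq provides. As a rough sketch, a common pattern is to pass them to the container via environment variables or a mounted config file; the variable and file names below are purely illustrative placeholders, not actual configuration keys.

```bash
# Hypothetical example: supply Systemiq-provided credentials and configuration.
# SYSTEMIQ_API_KEY and /etc/middleware/config.yaml are placeholder names only.
docker run -d -p 50051:50051 --name systemiq-middleware \
  -e SYSTEMIQ_API_KEY=<your-provided-key> \
  -v "$(pwd)/config.yaml:/etc/middleware/config.yaml:ro" \
  systemiq/middleware
```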
```python
import logging
import os
import time

from publisher_service import Publisher

# Unique identifier for the dataset
ELEMENT_ID = 10

# Connect to Middleware's local gRPC endpoint
OBSERVER_ENDPOINT = os.getenv("OBSERVER_ENDPOINT", "localhost:50051")

publisher = Publisher(observer_endpoint=OBSERVER_ENDPOINT)


def fetch_data():
    """
    Simulated data collection function.
    Replace with actual logic to gather data.
    """
    return {
        "timestamp": int(time.time()),
        "value": 42.5,  # Example data value
    }


def main():
    logging.basicConfig(level=logging.INFO)
    while True:
        try:
            data = fetch_data()
            logging.info(f"Publishing data to Middleware: {data}")
            # Send a batch of data to Middleware for processing
            publisher.batch(ELEMENT_ID, [data], "process")
        except Exception as e:
            logging.error(f"Error publishing data: {e}")
        time.sleep(10)  # Publish data every 10 seconds


if __name__ == "__main__":
    main()
```
In this example, a custom Publisher collects real-time data and sends it to Middleware. The Middleware processes and validates the data before forwarding it to Systemiq.
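The example above sends one record per call. Because `publisher.batch` accepts a list, you can also buffer several readings and publish them together. The sketch below reuses the `Publisher` setup, `ELEMENT_ID`, and `fetch_data` function from the example and assumes nothing beyond the call already shown; the batch size is illustrative.

```python
BATCH_SIZE = 5  # Illustrative batch size


def publish_in_batches():
    buffer = []
    while True:
        buffer.append(fetch_data())
        if len(buffer) >= BATCH_SIZE:
            # Send the accumulated records to Middleware in a single call
            publisher.batch(ELEMENT_ID, buffer, "process")
            buffer = []
        time.sleep(10)
```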
Note: While the example above is implemented in Python, our integrations support multiple programming languages. If you need examples or assistance in another language, please reach out. For more details and community contributions, visit our GitHub repository.