Event Portal 1.0

Event Portal is an event management tool—presented through an accessible Web user interface (UI)—that enables you to design, discover, visualize, share, and manage various aspects of your event-driven architecture (EDA).

A key benefit of Event Portal is its ability to track the relationships that exist in a highly decoupled EDA. It enables the reuse of schemas and events, and graphically presents the relationships between applications and events.

Furthermore, as an Event Portal user, you can model your EDA across different operational environments. Event Portal allows for environment separation in the same way that event broker services are separated within enterprises. This means you can create a different cloud console or Event Portal account for each environment within the enterprise. You can then create and grant different user access permissions for each of these environments, and use them to model your EDAs as they progress from one environment to the next.

To get started, let's first understand the Foundational Elements of Event Portal.

Foundational Elements of Event Portal

When designing an event-driven architecture (EDA), it is essential to model the enterprise or its interworking systems as a whole, and then divide it into smaller, more manageable pieces, which Event Portal defines as application domains. Within each application domain, you can create a set of event-driven entities or objects (schemas, events, and applications) that represent the runtime interactions.

The four foundational elements of Event Portal are application domains, schemas, events, and applications.

Schemas, Events, and Applications are often referred to as objects in Event Portal.

Application Domain

An application domain represents a namespace where applications, events, and schemas can live. Application domains are useful for decomposing the enterprise and creating organizational boundaries within which teams can work. Within an application domain, you can create a suite of applications, events, and schemas that are isolated from other application domains. This provides a way to create independent event-driven architectures for different products. You can also share events and schemas across application domains, enabling the integration of events and data between different organizational groups in a controlled and governed manner. The use of one or more application domains provides a way to organize and create event-driven architectures for different teams, groups, or lines of business within the organization. Organizations should determine the best use of application domains based on their organizational structure and architectural goals.

An application, event, or schema can belong to only one application domain, for organizational ownership reasons. An event or schema can be marked as shared to enable its use across application domain boundaries. This lets the users of an application domain strictly govern the events that are exposed externally, while retaining the agility to change the events that remain internal to the application domain.

Furthermore, application domains define a topic domain. The topic addresses defined for all the events in an application domain should begin with that topic domain. For example, if a topic domain is defined as solace/weather, then an event in that application domain may use the topic address solace/weather/blizzard/started.
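As a minimal sketch, the topic domain can be thought of as a prefix check on topic addresses; the domain value and helper function below are illustrative, not part of Event Portal:

# Sketch (Python): checking that an event's topic address begins
# with its application domain's topic domain.
TOPIC_DOMAIN = "solace/weather"

def in_topic_domain(topic_address: str) -> bool:
    # The topic domain must be a level-aligned prefix of the address.
    return topic_address.startswith(TOPIC_DOMAIN + "/")

assert in_topic_domain("solace/weather/blizzard/started")
assert not in_topic_domain("solace/traffic/jam/started")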

Schema

In simple terms, a schema represents the contract that describes the payload of an event. Producers and consumers of an event can trust that the event's payload matches the schema definition assigned to that event. Schemas define a type of payload through JSON, AVRO, XML, Binary, or Text. JSON, AVRO, and XML schemas have content that describes each property of the schema; the content is in JSON Schema, AVRO Schema, or XSD/DTD format.

Furthermore, you can search and view JSON schemas in a human-readable format alongside the actual schema source. This makes it easy for users to read, browse, and understand the schema contents without going through the source.
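For illustration, the following sketch builds a minimal JSON Schema for a hypothetical blizzard event payload; the field names are invented for this example and are not drawn from Event Portal:

import json

# A minimal JSON Schema for a hypothetical blizzard event payload.
blizzard_schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
        "region": {"type": "string"},
        "windSpeedKmh": {"type": "number"},
        "startedAt": {"type": "string", "format": "date-time"},
    },
    "required": ["region", "startedAt"],
}
print(json.dumps(blizzard_schema, indent=2))

Producers and consumers that agree on this contract can validate payloads against it before publishing or after consuming.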

Event

An event represents a business moment or an action that can be communicated to zero or more interested applications. The event is where you define metadata that describes and categorizes it. An event is produced on a specific topic that must be defined for that event. From a modelling perspective, events reference payload schemas and are referenced by applications, thus forming the glue that binds applications together.

It is important to note that an event (or an event type), as referred to in Event Portal, is different from an event instance. For more information on this topic, refer to Event Type vs Event Instance.

Application

An application represents a piece of software that produces and consumes events. Applications connect to the event broker in an event-driven architecture and communicate with other applications via events. A single application represents a class of applications that are running the same code base; therefore, there is no need to create multiple applications for each instance in a cluster.

When creating an application, you can specify the Application Type: Standard or Kafka Connector.

  • Standard: A standard application that connects to an event mesh.
  • Kafka Connector: A Kafka Connector is used for moving data into Kafka from other systems and out of Kafka into other systems.

SAP Event Mesh to Kafka Connectors

A connector is used in Kafka for connecting Kafka brokers with external systems to stream data into or out of Apache Kafka. In the Event Portal, a Kafka Connector is an application class you select to configure associated published and/or subscribed events and a set of Kafka-native attributes.

There are two sets of SAP to Kafka connectors. Each set comprises a Source connector and a Sink connector to support bi-directional flow, although you are not required to use both.
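As an illustration of the Kafka-native attributes involved, the following sketch shows the shape of a typical Kafka Connect sink connector configuration as submitted to the Kafka Connect REST API; the connector name, class, and topics are hypothetical placeholders:

# Sketch (Python): the shape of a Kafka Connect sink connector
# configuration.
sink_connector = {
    "name": "sap-sink-connector",  # hypothetical connector name
    "config": {
        "connector.class": "com.example.SapSinkConnector",  # hypothetical class
        "tasks.max": "2",               # maximum task/thread allocation
        "topics": "orders,shipments",   # Kafka topics the sink consumes from
    },
}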

SAP-Kafka Sink Connector

The SAP-Kafka sink connector consumes Kafka records and publishes them as SAP events. The following objects and relationships are discovered:

  • SAP (advanced event mesh):
      • Logical Event Mesh
      • the topics and queues that the events are published to
  • Kafka:
      • Logical Event Mesh
      • topics
      • schemas
      • the sink connector
      • the consumer group for the connector (along with the associated topic subscriptions)
SAP-Kafka Source Connector

The SAP-Kafka source connector consumes SAP events and publishes them as Kafka records. The following objects and relationships are discovered:

  • SAP (advanced event mesh):
      • Logical Event Mesh
      • the topics and queues (and associated subscriptions) used as input to the connector
      • the connector client (and associated subscriptions)
  • Kafka:
      • Logical Event Mesh
      • topics
      • schemas
      • the connector (along with the list of topics the connector is publishing to)

To learn more about Kafka connectors, refer to the discussion on connectors in Kafka's documentation. Refer to Adding Connectors to an Application Domain to learn how they're supported in Event Portal.

Event Portal Tools

Use these tools to create, design, manage, discover, visualize, and share all events within your ecosystem.

Discovery: Discover and visualize the events in your event-driven architecture from your event brokers. Initial support covers discovery and import of events, schemas, and application interactions from Apache Kafka, Confluent, and Amazon MSK. Support for additional broker types will follow.

Designer: Design and view all aspects of your event-driven architecture.

Catalog: Browse for events, schemas, and applications defined in your environment using a searchable, sortable, and filterable interface.

Event API Products (Beta): Bundle your events and make them available as Event API Products.

Logical Event Mesh

A Logical Event Mesh (LEM) represents the event mesh over which the associated published and subscribed events flow within an event-driven architecture. For event broker services, a LEM represents a Message VPN or a set of Message VPNs connected via DMR or MNR links. For Kafka brokers, a LEM represents a Kafka broker cluster. Each LEM has a set of topic addresses that are registered as published and/or subscribed to by applications connected to that event mesh. Based on the LEM, users can see the events that each application client delivery endpoint will attract through its subscriptions and, therefore, know which events the application will consume at runtime and how they will be consumed.

An organization may have one or more LEMs of the same event broker type and/or different broker types. An application may use multiple LEMs at the same time; for example, an application could use one LEM for SAP events and another for Kafka events. It's important to remember that a LEM must have a specific broker type for the event routing behaviour to be correctly modelled in Event Portal. This allows the associated Client Delivery Endpoints to be tailored to those specific broker types and constructs. For example, you can use consumer groups for Kafka CDEs and Event Queues for SAP CDEs.

The scope of a LEM for the following event broker types is as follows:

SAP

For SAP event broker services, it represents the model of a Message VPN or a set of DMR- or MNR-meshed Message VPNs (an event mesh). The topic level delimiter is always a forward slash (/).

Kafka

For Kafka, it represents a cluster of event brokers; that is, the mesh over which the associated published and subscribed events flow. If you have more than one Kafka cluster, you will have one Logical Event Mesh per cluster. The topic level delimiter is configurable and can be set to whatever the organization has decided to use; dots (.) and underscores (_) are commonly used.

Topic Tree

The Topic Tree represents all topic addresses that will be published within a Logical Event Mesh (LEM). The scope of a topic address is limited to the LEM in which it is targeted or published. For SAP events, you can associate only one event per topic address, and an event cannot be published to multiple Logical Event Meshes. To model that case, you need to create a separate event with a slightly different name and the same topic address, targeted to a different Logical Event Mesh. For Kafka-based events, however, you can have multiple events with the same topic address but different schemas.

The primary purpose of a topic tree is to enable the system, and therefore users, to evaluate Client Delivery Endpoint subscriptions and determine the topic addresses (and therefore which events) the associated consuming applications will attract. The Topic Tree inherits the topic delimiter from the Logical Event Mesh.

An organization should first define a topic hierarchy (refer to our Topic Architecture Best Practices guide) and then create the topic addresses based on the topic hierarchy. These addresses are placed within a bounded context of a LEM against which subscriptions can be evaluated.

Topic Address

A topic address is a set of topic levels used to construct the topic for a published event. The events defined in Event Portal must have a topic address for the runtime event broker to route the events to interested parties.

Based on the topic address, a Client Delivery Endpoint (CDE) receives the published event depending on the CDE's subscriptions. There can be zero or more CDEs with subscriptions that match an event published with a given topic address.

An event is only available for consumption within a Logical Event Mesh if it fulfills the following requirements:

  • it is targeted to a Logical Event Mesh
  • a topic address has been created for it

Once the topic address is created, it is added to the Topic Tree and evaluated against the Logical Event Mesh subscriptions. All the appropriate event-to-application relationships are automatically determined and updated.

Topic Level

A topic level is a component of a topic address that can be a literal value or a variable. A variable is a placeholder that is substituted for a concrete value by publishing applications at runtime. A variable type topic level can be bounded (with a defined value set) or it can be unbounded.

If a topic level is static, you can use a literal topic level to represent it. However, you could also define it as a bounded variable with an enumerated set of values, which applications do not have to substitute at runtime. Whether to use a literal or a variable is up to the person designing the topic address.
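As a sketch, a topic address with literal and variable levels can be thought of as a template that publishing applications fill in at runtime; the pattern and values below are illustrative:

# Sketch (Python): resolving variable topic levels at publish time.
pattern = "solace/weather/{eventType}/{region}"  # {...} marks variable levels

def resolve(pattern: str, **values: str) -> str:
    # Publishing applications substitute concrete values at runtime.
    return pattern.format(**values)

print(resolve(pattern, eventType="blizzard", region="ottawa"))
# solace/weather/blizzard/ottawa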

Topic

Topics are a means of classifying information, and in practice they're simply strings that are composed of one or more levels. For more information, refer to Understanding Topics.

Topic Subscriptions

A topic subscription is a string used to attract events that are published on an event mesh. A topic subscription can contain wildcards, which are used to match multiple topics or hierarchies of topics. Wildcard characters and matching behavior vary between messaging protocols.
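To make the idea concrete, here is a simplified matcher in which * matches exactly one whole level and a trailing > matches one or more remaining levels; real protocols differ in their wildcard semantics, so treat this only as a sketch:

# Sketch (Python): simplified subscription matching.
def matches(subscription: str, topic: str, delimiter: str = "/") -> bool:
    sub_levels = subscription.split(delimiter)
    top_levels = topic.split(delimiter)
    for i, level in enumerate(sub_levels):
        if level == ">":  # only valid as the final level in this model
            return i == len(sub_levels) - 1 and len(top_levels) > i
        if i >= len(top_levels):
            return False
        if level != "*" and level != top_levels[i]:
            return False
    return len(sub_levels) == len(top_levels)

assert matches("solace/weather/*/started", "solace/weather/blizzard/started")
assert matches("solace/weather/>", "solace/weather/blizzard/started")
assert not matches("solace/weather/>", "solace/weather")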

Client Delivery Endpoints

A Client Delivery Endpoint (CDE) represents the location on an Event Mesh that is used by an application to consume events. For this to happen, the following must be defined in the CDE:

  • a Logical Event Mesh for the CDE must be specified
  • the type of the CDE must be specified
  • the CDE must have subscriptions to consume events
  • a CDE can only be associated with one application, and each CDE can only be associated with an application once

A CDE is scoped to a single Logical Event Mesh; however, an application may have relationships with zero or more CDEs. In the case where the CDE represents a physical endpoint (for example, a queue or topic endpoint in an event broker service), the name of the endpoint is used for the CDE.

The following CDEs are currently supported in Event Portal:

Event Queue
An Event Queue is similar to a Queue, but it's specifically used for the publish/subscribe message exchange pattern. It does not support the request/reply or point-to-point message exchange patterns.
Durable Topic Endpoint
Durable topic endpoints are provisioned objects on the event broker that have a life span independent of a particular client session.
Direct Client Endpoint
Direct Client Endpoints are client connections with one or more subscriptions. A Direct Client Endpoint is essentially a direct messaging client: a consuming or subscribing client without a queue or topic endpoint.
Consumer Group (Kafka only)
Client username and client name, which are event broker concepts, don't apply to consumer groups because those concepts don't exist in Kafka. Event Portal supports the concept of Kafka's consumer groups.
A consumer group is used by Kafka to group consumers into a logical subscriber for a topic. In Event Portal, you can model consumer groups in Designer. This enables Event Portal's runtime discovery to associate a discovered consumer group with an existing application. Kafka consumers that belong to the same consumer group share a group ID. The consumers in a group divide the topic partitions as fairly as possible, so that each partition is consumed by only one consumer from the group.
To learn how you can set up consumer groups in Event Portal, see Mapping Consumer Groups to Applications. You can also visit Kafka's documentation to learn more about consumer groups. A short sketch of a consumer group client follows below.
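In the sketch, a consumer joins a consumer group using the kafka-python package; the topic name, group ID, and broker address are assumptions for illustration:

# Sketch (Python): consumers sharing a group_id form one logical
# subscriber, so Kafka divides the topic's partitions among them.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                        # illustrative topic name
    group_id="order-processors",     # consumers sharing this ID form a group
    bootstrap_servers="localhost:9092",
)
for record in consumer:
    print(record.partition, record.value)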

Event Type vs Event Instance

An event type (or event, as it is referred to in Event Portal) represents a class of events produced in an event-driven architecture. The event type is made up of a topic and a schema that represents the allowed payload for the event.

An event instance is a specific instance of an event that is produced. An event instance has an event type. It conforms to the schema of the event type and is produced on a topic defined for the event type. Over the lifecycle of an application, many event instances are produced and consumed.
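A small sketch may help distinguish the two; the event type and payloads below are illustrative:

# Sketch (Python): one event type, many event instances.
event_type = {
    "name": "Blizzard Started",
    "topic": "solace/weather/blizzard/started",
    "schema": {"type": "object", "properties": {"region": {"type": "string"}}},
}

# Two instances of the same event type, each conforming to its schema
# and produced on the type's topic:
instances = [{"region": "ottawa"}, {"region": "montreal"}]
for payload in instances:
    print(event_type["topic"], payload)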

Shared Events and Schemas

By default, events and schemas can only be referenced within their own application domain. For another application domain to reference your events and schemas, the event or schema must be marked as shared. For example, imagine a situation where two application domains are modeled in Event Portal: Weather Events and Traffic Events. Say an event called Blizzard is created in the Weather Events application domain. Since the Traffic Events application domain should be able to consume the Blizzard event, the creator of Blizzard marks the event as shared. Now, when an application is created in the Traffic Events application domain, it can choose to publish or subscribe to the Blizzard event. If the event were not marked as shared, it could not be referenced by applications in the Traffic Events application domain.

Discovery Agent

You can use the Discovery Agent to run a Discovery scan against your event broker. The agent will connect to your event broker to scan your event-driven architecture using a specified set of topic subscriptions. The gathered data is generated as a JSON file (Discovery file), which you can upload into Event Portal.

The following event brokers are currently supported: Apache Kafka, Confluent, Amazon MSK, and SAP event broker services.

You can install the agent locally and use it to scan your EDA and generate a Discovery file. To install the agent or learn how to run a Discovery scan, refer to Installing the Discovery Agent.

Information Discovery Agent Captures and Uploads

The information the agent gathers during a Discovery scan depends on the event broker type.

SAP Event Broker Service

The following data/metadata is returned in the Discovery scans:

Broker

The event broker service information taken from the input data specified by the user, which is also uploaded in the Discovery file:

"broker": {
	"brokerType": "solace",
	"hostname": "mySolaceHost.solace.com",
	"additionalAttributes": {
		"vpnName": "myVpnName"
	}
}
Clients

An active or inactive messaging client in the event broker. A client has a top-level attribute that indicates the client type; the client type implies message flow direction in the context of relationships with channel objects.

The following client type information is captured:

  • clientType: Client Application
  • clientUserName: myClientUsername
  • clientProfileName: myClientProfileName
  • aclProfileName: myAclProfileName
Channels

A channel represents the event broker entities that are responsible for the transmission and storage of messages.

The channel types include:

  • Topics
  • Queues
  • Topic Endpoints
Subscriptions

A subscription represents a potentially unbounded "channel space" described by a set of criteria (topic strings with wildcards, routing keys, and so on).

The following subscription types are captured:

  • client to topic subscription
  • queue to topic subscription
Object Relationships

The object relationships represent relationships between two or more top-level objects.

The following relationships are captured:

  • clientToSubscriptionRelationship
  • channelToSubscriptionRelationship
  • clientToChannelRelationship
  • channelToChannelRelationship

Kafka Broker

The following data/metadata is returned in the Discovery scans:

Topics

A list of topic names.
Connectors

A list of connectors with the following configuration data:

  • connector class
  • connector type
  • maximum thread allocation
Consumer Groups

A list of consumer groups with a flag indicating if they are simple consumers.

Schemas

A list of schemas. In the case of a string schema (non-JSON format), the contents of the message are included.
Object Relationships

The object relationships represent relationships between two or more top-level objects.

The following relationships are captured:

  • connectorToTopicAssociations
  • consumerGroupToTopicAssociations
  • topicToSchemaAssociations

Primitive Types

Some message payloads consist of primitive types, such as strings and numbers, rather than more complex structures (JSON or XML). Event Portal supports primitive event types so that you can model your events (for example, Kafka topics) with payloads that are simple primitives. Primitive types for message payloads are automatically detected during Discovery and can be committed or imported into Designer. Likewise, you can create and edit an event to use a primitive type in Designer. You can also generate an AsyncAPI specification for an application associated with an event that has a primitive payload.

The following primitive types are currently supported in Event Portal:

  • Avro: Null, Boolean, Int, Long, Float, Double, Byte, String
  • JSON: Null, Boolean, Number, String
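As a sketch, a primitive schema is a single-keyword declaration and the conforming payload is a bare value; the schemas and values below are illustrative:

import json

# Sketch (Python): primitive payload schemas and conforming payloads.
avro_primitive = {"type": "string"}  # an Avro primitive schema
json_primitive = {"type": "number"}  # a JSON Schema primitive

print(json.dumps("blizzard warning"))  # a valid string payload
print(json.dumps(42.5))                # a valid number payload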

To learn how to use a Primitive Type when designing an event, see Managing Events.

Multiple Record Types

It is common for Kafka-based applications to publish events that do not share the same schema to the same topic. In Event Portal, you can discover, re-discover, and stage topics with more than one record type. In other words, topics that have multiple records with different payload schemas are discovered during Discovery. The payload schema could be JSON, Avro, XML, Text, or Binary. Discovered events for the same topic are uniquely named to differentiate each event before they are committed into Designer as events. Once committed, the topic-to-schema association occurs in the backend and can then be visualized in Designer.

To learn how to discover and commit your EDA into Event Portal, refer to Discovery.

Tags

Applications, events, and schemas can all have tags associated with them. Searching for a tag name in Catalog finds all the objects associated with that tag.

Tags can be a great way to share information with other members of your organization by grouping sets of objects together. Here are some examples of how you can use tags:

  • Create a tag called Java and set it against all of your Java-based applications
  • Create a tag called User and tag all of your schemas related to users
  • Create a tag called Sign in Flow and tag the series of events and applications that are involved in the sign in flow for your application

In a future release of Event Portal, Designer will be able to filter applications and events based on the tags applied to them. This will give another layer of customization to visualize your event-driven architectures.

Revision History

Revision history tracks the changes to objects in Event Portal. Any time an application, event, or schema is updated, a new revision is created. For schemas, you can perform additional tasks, such as creating multiple versions of a particular schema and advanced version control management. Refer to Schema Versions and Version Control for more information.

You can restore an old revision of an object at any point. The act of restoring an object to a previous revision creates a new revision of that object. For example, an object whose current revision is Rev 2, when restored to Rev 1, will end up with three revisions; the new Rev 3 will be the same as Rev 1, since it was restored from it.

There are a few exceptions to consider when using the revision history of an object:

  • Owner and tag information of an object is not included in the revision history. Changing the owners or tags of an object will not create a new revision of the object.
  • Restoring to a revision of an object whose associations have changed will not change the associated objects. For example, imagine an event and a schema that have both undergone ten revisions. At the time of the second revision of the event, the schema was on its first revision. Reverting the event to its second revision will not return the schema to its first revision; it will remain at its tenth revision.
  • Application domains do not track revision history. Instead, if you need to revert an application domain and all of its contained objects to a point in time, it is recommended to use the import/export functionality.

For tutorials, refer to Managing Object Revisions.

Schema Versions and Version Control

You can create schemas with or without versions. When using schema versions, you can create and store multiple versions of the same schema, which is useful for EDAs that require two or more versions of a particular schema. Revision history is supported with schema version control, including saving revisions of each schema, viewing changes between revisions, and reverting to an older revision.

For additional information and tutorials, see Managing Schemas.

Archive Objects and Revisions

When you delete an object, it will be archived for ninety days, along with its associated revisions. You can restore the deleted object and its associated revisions anytime within the ninety days.

To learn more, refer to Deleting an Object.

Role-Based Access Control (RBAC)

Event Portal supports the ability to apply access controls to application domains through Designer. Event Portal's role-based access control (RBAC) helps you manage who has access to application domains and their resources, and what they can do with them. For example, enterprise architects could be responsible for the entire event-driven architecture (EDA) and its assets, and therefore require the Administrator or Event Portal Manager role to view and modify all application domains. On the other hand, application developers may be responsible for specific parts of the EDA, and therefore may only require access to view or edit one or more specific application domains. In that case, they can be assigned the Event Portal User role with Viewer, Editor, or Manager level access to specific application domains.

For more information on Event Portal RBAC, refer to Managing Application Domains.

AsyncAPI

AsyncAPI is an open-source initiative that seeks to improve the current state of Event-Driven Architectures (EDA). AsyncAPI makes it easier for development teams to create applications that communicate asynchronously through events. The core of AsyncAPI is a specification that describes how to create a document that explains the inner workings of an event-driven API.

You can use an AsyncAPI specification for many functions, such as:

  • Generating documentation
  • Generating code
  • Validating events received by your application
  • Applying API management policies

Learn more about AsyncAPI on their website at https://www.asyncapi.com.

AsyncAPI and the Event Portal

Event Portal natively supports the AsyncAPI 2.0.0 specification, and applications can be exported as AsyncAPI documents. You can export applications in JSON and YAML, the two supported formats of AsyncAPI.
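The following sketch shows the minimal shape of an AsyncAPI 2.0.0 document in JSON form; the title, channel, and payload are illustrative rather than an actual Event Portal export:

import json

# Sketch (Python): a minimal AsyncAPI 2.0.0 document.
asyncapi_doc = {
    "asyncapi": "2.0.0",
    "info": {"title": "Weather Service", "version": "1.0.0"},
    "channels": {
        "solace/weather/blizzard/started": {
            "subscribe": {
                "message": {
                    "payload": {
                        "type": "object",
                        "properties": {"region": {"type": "string"}},
                    }
                }
            }
        }
    },
}
print(json.dumps(asyncapi_doc, indent=2))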

To learn how to generate an AsyncAPI document for an application, refer to Generating an AsyncAPI.

REST API

Event Portal provides a RESTful API that you can use to manage your data in the advanced event mesh. Use the REST API to integrate other applications, systems, or client applications with Event Portal, and model or retrieve your event-driven architectures from your own client applications.
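As a sketch only, a client application might call the REST API as follows; the base URL, endpoint path, and response shape are assumptions here, so consult the Event Portal v1 REST API documentation for the actual details:

import requests

# Sketch (Python): listing application domains via the REST API.
BASE_URL = "https://api.solace.cloud/api/v1/eventPortal"  # assumed base URL
headers = {"Authorization": "Bearer <your-API-token>"}    # assumed auth scheme

resp = requests.get(f"{BASE_URL}/applicationDomains", headers=headers)
resp.raise_for_status()
for domain in resp.json().get("data", []):  # assumed response envelope
    print(domain.get("name"))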

For more information, refer to the Event Portal v1 REST API documentation.

Related Topics