https://github.com/uniteeio/Unitee.EventDriven
Unitee.EventDriven is a library for Event-Driven Programming (EDP) in a distributed environment.
dotnet add package Unitee.EventDriven.RedisStream
For now, we mainly focus on Redis as an event store because:
- Easy to deploy or find free (cheap) clusters
- Easy to visualize with a GUI tool
- A tool you may already be familiar with (for caching, for example)
- Built-in support for pub/sub and stream storage
- Good .NET integration
Features:
- Publishing distributed messages
- Subscribing to distributed messages
- Request/Reply pattern
- Scheduling messages
- Processing pending messages at startup
- Recurring tasks (cron)
- Localized messages
- Use the package StackExchange.Redis to make the IConnectionMultiplexer available in the DI container.
var multiplexer = ConnectionMultiplexer.Connect(builder.Configuration["Redis:ConnectionString"]);
builder.Services.AddSingleton<IConnectionMultiplexer>(multiplexer);
- Create an event as a POCO object.
[Subject("USER_REGISTERED")]
public record UserRegistered(int UserId, string Email);
If the subject is omitted, the name of the type is used instead (here, UserRegistered).
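For instance, an event declared without the attribute uses its type name as its subject (OrderShipped below is a hypothetical example, not part of the library):
// No [Subject] attribute: the subject defaults to the type name, "OrderShipped".
public record OrderShipped(int OrderId);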
- Register the publisher:
builder.Services.AddRedisStreamPublisher();
Use the IRedisStreamPublisher to actually publish the event:
[ApiController]
public class UserController : ControllerBase
{
    private readonly IRedisStreamPublisher _publisher;
    private readonly IUserService _userService;

    public UserController(IRedisStreamPublisher publisher, IUserService userService)
    {
        _publisher = publisher;
        _userService = userService;
    }

    public async Task<IActionResult> Register(string email)
    {
        var userId = _userService.CreateUserInBdd();
        await _publisher.PublishAsync(new UserRegistered(userId, email));
        return Ok();
    }

    // Request a reply
    public async Task<IActionResult> ForgotPassword(string email)
    {
        try
        {
            var response = await _publisher.RequestResponseAsync(new PasswordForgotten(email));
            return Ok();
        }
        catch (TimeoutException)
        {
            return NotFound();
        }
    }

    // Schedule a message for later delivery
    public async Task<IActionResult> RegisterScheduled(string email)
    {
        await _publisher.PublishAsync(new UserRegistered30MinutesAgo(email), new()
        {
            ScheduledEnqueueTime = DateTime.UtcNow.AddMinutes(30)
        });
        return Ok();
    }
}
You need to register a RedisStreamBackgroundReceiver:
services.AddRedisStreamBackgroundReceiver("ConsumerService");
Implementation detail: the name is used to create a consumer group. A message is delivered to every consumer group (one-to-many communication).
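For example, two services can each register their own group name and both receive every message (the service names below are illustrative):
// In the notification service:
services.AddRedisStreamBackgroundReceiver("NotificationService");

// In the billing service:
services.AddRedisStreamBackgroundReceiver("BillingService");

// A message published once is delivered to both consumer groups,
// so each service processes its own copy.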
You also need to create a class that implements IRedisStreamConsumer<TEvent>:
public class UserRegisteredConsumer : IRedisStreamConsumer<UserRegistered>
{
    // IEmailService is an application-specific dependency, shown here for illustration.
    private readonly IEmailService _email;

    public UserRegisteredConsumer(IEmailService email) => _email = email;

    public async Task ConsumeAsync(UserRegistered message)
    {
        await _email.Send(message.Email);
    }
}
Then, register your consumer:
services.AddTransient<IConsumer, UserRegisteredConsumer>();
All consumers should be registered with AddTransient so that each one gets its own scope, since they are executed concurrently.
If you want your consumer to be able to reply or to access the message metadata, implement IRedisStreamConsumerWithContext<TRequest, TResponse> instead.
// Use object (or any other type) as TResponse if you don't plan to reply to the message.
// _logger and _email are injected dependencies, omitted here for brevity.
public class UserRegisteredConsumer : IRedisStreamConsumerWithContext<UserRegistered, MyResponse>
{
    public async Task ConsumeAsync(UserRegistered message, IRedisStreamMessageContext context)
    {
        _logger.LogInformation(context.Locale);
        await _email.Send(message.Email);
        await context.ReplyAsync(new MyResponse());
    }
}
If a consumer throws, the message and the exception are published to a special queue: the dead letter queue. The default name is DEAD_LETTER, but you can configure it by providing a second parameter to AddRedisStreamBackgroundReceiver. You can easily imagine a script that pulls the messages from the dead letter queue and sends them again, as sketched below.
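For instance, a minimal sketch of such a script using StackExchange.Redis directly, assuming the dead letter queue is a plain Redis stream named DEAD_LETTER (the entry layout is not specified here, so inspect it before re-publishing):
// Hypothetical maintenance script: inspect (and optionally drain) the dead letter stream.
var db = multiplexer.GetDatabase();

// Read every pending entry from the beginning of the stream.
var entries = await db.StreamReadAsync("DEAD_LETTER", "0-0");

foreach (var entry in entries)
{
    foreach (var field in entry.Values)
    {
        Console.WriteLine($"{entry.Id} {field.Name}: {field.Value}");
    }

    // Once the message has been re-published (or discarded), remove it from the stream.
    await db.StreamDeleteAsync("DEAD_LETTER", new[] { entry.Id });
}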
Inside a consumer group, you can have multiple consumers. Each consumer group receives a single copy of the message.
You can name the consumer with the third parameter of AddRedisStreamBackgroundReceiver. You should use a unique name PER INSTANCE, for example as shown below.
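For example, assuming the dead letter queue name and the consumer name are the second and third parameters (check the actual overloads of AddRedisStreamBackgroundReceiver in your version):
// Give each running instance its own consumer name inside the "ConsumerService" group.
// The parameter order below is an assumption based on the description above.
services.AddRedisStreamBackgroundReceiver(
    "ConsumerService",                                // consumer group, shared by the service
    "DEAD_LETTER",                                    // dead letter queue name
    $"{Environment.MachineName}-{Guid.NewGuid():N}"); // consumer name, unique per instance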
When multiple consumers are subscribed to the same event, or when multiple events are pending, they are executed concurrently. This means that you should not rely on the order in which they were inserted.
To avoid concurrency issues, consumers should be registered as transient. So if you use Entity Framework, register the DbContext as transient too:
builder.Services.AddDbContext<ApplicationDbContext>(
    options => options.UseSqlServer(dbConn),
    ServiceLifetime.Transient);
You can add a Redis hash under a special key: Cron:Schedule:{Name of your cron}. This hash should have the following fields (in the order below):
- CronExpression: a cron expression that can be parsed with Cronos (https://github.com/HangfireIO/Cronos)
- EventName: the name of the event to trigger when the cron expression is hit
Every time the cron expression is hit, an event with the name EventName is published, as in the example below.
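For example, a minimal sketch that registers such a hash with StackExchange.Redis (the cron name DailyReport and the event name SEND_DAILY_REPORT are hypothetical):
// Publish an event named "SEND_DAILY_REPORT" every day at 08:00.
var db = multiplexer.GetDatabase();

await db.HashSetAsync("Cron:Schedule:DailyReport", new HashEntry[]
{
    new("CronExpression", "0 8 * * *"),    // parsed with Cronos
    new("EventName", "SEND_DAILY_REPORT"), // subject of the published event
});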
You can configure the Redis stream publisher and receiver by calling:
builder.Services.AddRedisStreamOptions(options =>
{
    options.JsonSerializerOptions.PropertyNamingPolicy = JsonNamingPolicy.CamelCase;
});
Keyspace events, when enabled, are pushed to a special Redis stream: KEYSPACE_EVENTS, allowing consumers to consume keyspace notifications.
https://redis.io/docs/manual/keyspace-notifications/
Example use case: debounce a series of events by repeatedly postponing the expiration of a key; when the key finally expires, execute the action. A sketch of the key side is shown below.
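A minimal sketch of that debounce idea using StackExchange.Redis directly (the key name and the 30-second delay are illustrative; keyspace notifications must also be enabled on the Redis server, e.g. notify-keyspace-events with at least Ex for expiration events):
// Each time an event arrives, (re)write a short-lived key; only inactivity lets it expire.
var db = multiplexer.GetDatabase();

async Task TouchDebounceKeyAsync(int userId)
{
    // Overwrites the key and pushes its expiration 30 seconds into the future.
    await db.StringSetAsync($"debounce:user:{userId}", DateTime.UtcNow.ToString("O"),
        expiry: TimeSpan.FromSeconds(30));
}

// When no new event arrives for 30 seconds, the key expires, Redis emits a keyspace
// notification, and the KEYSPACE_EVENTS stream lets a consumer react to it.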
WARNINGS: this feature is not perfect because:
- it uses pub/sub to subscribe to keyspace events, so if the service is down, some events can be missed
- events can be received (and pushed) multiple times when multiple instances are running
The feature will move to a Redis Function (Redis 7) when it becomes available.