
Native Language Workflows

Since the framework currently supports the Node.js, Deno and Bun.js ecosystems, the native languages supported are TypeScript and JavaScript. A native language workflow lets you write custom functions in JavaScript or TypeScript, where you can implement intricate business logic.

tip

In Godspeed, your function gets input in a standard JSON format and returns output in a standard JSON format, independent of the eventsource through which this function is triggered. The eventsource could be Express, Fastify or Apollo GraphQL, an event message bus like Kafka or RabbitMQ, or a socket message. This means Godspeed has a unified way to deal with all eventsources, giving you a modular architecture and reusability of your functions.

Example TypeScript function

import { GSCloudEvent, GSContext, PlainObject } from "@godspeedsystems/core";
import Pino from 'pino';

export default function (ctx: GSContext, args: any) {
    const {
        inputs: {
            data: {
                params,  // path parameters from the endpoint url
                body,    // request body in case of http and graphql apis, event data in case of message bus or socket
                query,   // query parameters from rest api
                user,    // user payload parsed from jwt token
                headers  // request headers in case of http and graphql apis
            }
        },
        childLogger, // context-specific logger. Read about Pino child loggers for more information
        logger,      // basic logger of the project; generally prefer childLogger for logging
        outputs,     // outputs of previously executed tasks of yaml workflows (if any)
        functions,   // all loaded workflows/functions from the src/functions/ folder
        datasources, // all configured datasources from src/datasources
        mappings     // mappings from the src/mappings folder, useful for loading key-value configurations for the business logic of your project
    }: {
        inputs: GSCloudEvent,
        childLogger: Pino.Logger, // you can also add custom attributes to childLogger
        logger: Pino.Logger,
        outputs: PlainObject,
        functions: PlainObject,
        datasources: PlainObject,
        mappings: PlainObject
    } = ctx;

    // Will print with workflow_name and task_id attributes.
    childLogger.info('Server is running healthy');
    // Will print without workflow_name and task_id attributes.
    logger.info('Arguments passed %o', args);
    logger.info('Inputs object \n user %o query %o body %o headers %o params %o', user, query, body, headers, params);
    logger.info('Outputs object has outputs from previous tasks with given ids %o', Object.keys(outputs));
    logger.info('Datasources object has following datasource clients %o', Object.keys(datasources));
    logger.info('Total functions found in the project %s', Object.keys(functions).length);

    // Returning only data
    return 'Its working! ' + body.name;

    // SAME AS
    // return {
    //     data: 'Its working! ' + body.name,
    //     code: 200,
    //     // success: true,
    //     // headers: undefined
    // }

    // SAME AS
    // return {
    //     data: 'Its working! ' + body.name,
    //     code: 200,
    //     success: true,
    //     headers: undefined // or you can set response headers as key: value pairs,
    //                        // for example headers: {'custom-header1': 'xyz'}
    // }
}
tip

To see how the framework handles data returned from a function, including how code, success and data are computed, read the section at the bottom of this page.

GSContext

GSContext carries the loaded components of this project, as well as the inputs of the current event.

args

The second parameter of the function is args. It is useful when the function is called from a YAML workflow in Godspeed: the args set in the calling YAML task are passed here. args can be of any native type: object, array, string, number or boolean.

Caller YAML function
summary: some workflow
tasks:
  - id: first_task
    fn: some_function
    args:
      name: mastersilv3r
Callee TypeScript function
export default function (ctx: GSContext, args: PlainObject) {
    ctx.logger.info(args.name); // Prints 'mastersilv3r'
}

More about GSContext

note

Every function/workflow has access to the ctx object, which is passed as an argument; you can access its properties by destructuring it.

Check the code of the GSContext interface here. GSContext carries the contextual information of your current workflow and is available to the event handlers (functions). It is passed on to any sub-workflows subsequently called by the event handler.

It includes all the context-specific information like tracing information, actor, environment, headers, payload etc.

Everything you need to know or store about the event and the workflow executed so far, as well as the loaded functions, datasources, logger, childLogger, config, mappings etc., is available in the GSContext object.
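
For instance, here is a minimal sketch of reading config and mappings off ctx (the my_feature_flag and discounts keys below are hypothetical, purely for illustration):

import { GSContext, GSStatus } from "@godspeedsystems/core";

export default function (ctx: GSContext) {
    const { config, mappings, childLogger } = ctx;
    // config holds the project configuration loaded from the config/ folder
    childLogger.info('Feature flag value: %o', config.my_feature_flag); // hypothetical key
    // mappings hold the key-value data loaded from src/mappings
    const discount = mappings.discounts?.default; // hypothetical mapping
    return new GSStatus(true, 200, undefined, { discount });
}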

Inputs

The inputs object provides all the information passed to the event, like headers, params (path params), query, body & user (the parsed JWT payload).

const { inputs } = ctx;
const body = inputs.data.body;

Outputs

To access the outputs of tasks executed before the current task, destructure the ctx object just as with inputs and datasources. If a workflow has more than one task, the output of the first task can be accessed in a later task through the outputs object, keyed by the first task's id.

const { outputs } = ctx;
const firstTaskOutput = outputs[firstTaskId];
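
For example, continuing the caller YAML workflow shown above, a later task's function could read the first task's result like this (a sketch; it assumes each entry in outputs is the GSStatus-like result of the task with that id):

import { GSContext } from "@godspeedsystems/core";

export default function (ctx: GSContext) {
    const { outputs, childLogger } = ctx;
    // 'first_task' is the id given to the earlier task in the YAML workflow
    const firstTaskResult = outputs['first_task'];
    childLogger.info('first_task returned: %o', firstTaskResult?.data);
    return firstTaskResult?.data;
}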

Accessing Datasource Clients

Through datasources you can access all configured datasources, along with their clients and methods.


const { datasources, inputs } = ctx;
const responseData = await datasources.mongo.client.Restaurant.create({
    data: inputs.data.body
});

ChildLogger

Through childLogger you get access to a Pino logger instance with context information already set, for example the log.attributes configured at the eventsource or event level.


const { childLogger, inputs } = ctx;
childLogger.info('inputs: %o', inputs.data.body);
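
As mentioned in the example at the top of this page, you can also add custom attributes to childLogger. A minimal sketch using Pino's standard child() API (the request_id attribute is made up for illustration):

const requestLogger = ctx.childLogger.child({ request_id: 'abc-123' }); // hypothetical attribute
requestLogger.info('This log line carries the request_id attribute as well');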

Returning from a function

GSStatus

Developers can return a plain JSON object in the response, or a GSStatus if they use TypeScript. GSStatus is a built-in class in Godspeed; we invoke it when we want to define an API response manually and dispatch it. GSStatus has the properties below.

success: boolean;
code?: number;
message?: string;
data?: any;
headers?: {
    [key: string]: any;
};

We return a GSStatus as below:

return new GSStatus(true, 200, 'OK', responseData, headers);

Different ways to return from an event handler

// Returning only data
return 'Its working! ' + body.name;

// SAME AS
return {
    data: 'Its working! ' + body.name,
    code: 200,
    message: 'OK', // HTTP protocol message to be returned from the service
    // success: true,
    // headers: undefined
}

// SAME AS
return {
    data: 'Its working! ' + body.name,
    code: 200,
    message: 'OK',
    success: true,
    // headers: undefined
}

// SAME AS
return {
    data: 'Its working! ' + body.name,
    code: 200,
    message: 'OK',
    success: true,
    headers: undefined
}

// SAME AS returning GSStatus like this
return new GSStatus(true, 200, 'OK', 'Its working! ' + body.name, headers);
Note

Check the event handler response section to learn how the framework handles GSStatus.

Invoking functions and datasource clients from JS/TS functions

  • All functions within a Godspeed project, including those written in YAML, JavaScript (JS), or TypeScript (TS), are accessible through the ctx.functions object.

  • Of course, you can also import them in the standard TypeScript or JavaScript way.

  • Similarly, all datasource clients initialised in a Godspeed project are conveniently available under the ctx.datasources object.

export default async function (ctx: GSContext, args: any) {

    // Calling functions (yaml, js, ts) from within a ts/js function:
    // all project functions are available under ctx.functions.
    const fnResponse = await ctx.functions['com.gs.helloworld2'](ctx, args);
    // Same as importing the function in the standard way:
    // const fnResponse = await require('com/gs/helloworld2')(ctx, args);

    // Calling datasource functions. All datasources are available under the ctx.datasources hood.
    // OPTION 1
    // Every datasource exposes a client key. The client may be a single instance as with Axios,
    // multiple datasource client instances as with AWS, or database models as with Mongoose
    // (depending on the plugin used).
    // const res = await ctx.datasources.aws.client.s3.listBuckets(args);

    // OPTION 2
    // All datasources have an execute method. This may be preferable when you want to utilise the
    // full capabilities of the plugin wrapped over the native clients, like error handling checks
    // and response codes, retries, caching etc.
    const res = await ctx.datasources.aws.execute(ctx, {
        // Pass exactly the same args as this AWS service's method takes
        ...args,
        // Along with args, pass a meta object, which can contain {entityType, method}
        meta: { entityType: 's3', method: 'listBuckets' }
    });
    if (!res.success) {
        return new GSStatus(false, res.code || 500, undefined, { message: "Internal Server Error", info: res.message });
    }
    // If a developer only returns data, without setting keys like "success" or "code" in the
    // response, the framework assumes it is just the data. In such cases, the response code
    // defaults to 200 and success is assumed to be true.
    return res;
    // works the same as: return new GSStatus(true, 200, undefined, res);
}

Handling event handler return

By default, all framework-defined and developer-written functions must return either a GSStatus or plain data.
Let's see how the framework qualifies your return value as a GSStatus or as simple data: it checks whether the returned data contains the code or success meta-keys.

Non-Authz (normal) workflows
i. If both are present, it looks for the other GSStatus keys and sets them.
ii. If only code is present and it lies between 200-399, then success is assumed to be true, else false. It looks for the other GSStatus keys and sets them.
iii. If only success is present, then code is assumed to be 200. It looks for the other GSStatus keys and sets them.
iv. If it finds none of these keys, it assumes everything you returned is intended to be GSStatus.data. It then adds code: 200 and success: true internally to your response and creates a GSStatus out of it, to pass on to the next tasks or workflows. These rules are illustrated in the sketch below.
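
A minimal sketch illustrating these rules with hypothetical handler return values (the resulting GSStatus in each case follows rules i-iv above):

// Rule i: both code and success present; used as-is
// return { success: false, code: 403, message: 'Forbidden' }
//   -> GSStatus { success: false, code: 403, message: 'Forbidden' }

// Rule ii: only code present; 201 lies within 200-399, so success: true
// return { code: 201, data: { id: 1 } }
//   -> GSStatus { success: true, code: 201, data: { id: 1 } }

// Rule iii: only success present, so code defaults to 200
// return { success: true, data: 'done' }
//   -> GSStatus { success: true, code: 200, data: 'done' }

// Rule iv: none of the meta-keys present, so the whole value becomes GSStatus.data
// return { name: 'mastersilv3r' }
//   -> GSStatus { success: true, code: 200, data: { name: 'mastersilv3r' } }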

Authz workflows
Check response handling in case of authorization workflows.

info

You can study the code here to understand both of the above scenarios better.