Local executors can get the test report via the Serial interface.
Remote test executors don't have this option. At present there is no easy way to pass a large amount of data via the cloud. Here are the options:
use Spark.publish(): maximum event size is 64 bytes and maximum of 1 per second, 4 in a burst.
open a TCP server: this could work, but we'd need to discover our public IP address for the remote client to connect to. There are services on the internet that can provide this, but it's too much effort!
use a Spark variable: copy (say) 600 bytes from the log to a variable. Use a function to advance the pointer (and variable content) to the next 600 bytes. The function returns -1 when there is no more content (and then wraps around and starts from the beginning). This is fairly easy to code, test and use.
write to github directly: too tightly coupled to a specific use-case.
I'll implement this using a variable and a function to advance the start point. The interaction looks like this:
call the update function: this fetches the next N bytes from the circular buffer, writes them to the variable, advances the buffer's read pointer, and returns N.
if the circular buffer has 0 characters available, check the test status flag to see if the test is complete: when complete, return -1; otherwise return 0. This lets client code keep retrying while the log is still open.
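The scheme above can be sketched as plain C++ (the names `logWrite`, `logAdvance`, the buffer sizes, and the 600-byte chunk are illustrative assumptions, not the actual firmware code; on the device, `chunk` would be registered with `Spark.variable()` and `logAdvance` with `Spark.function()`):

```cpp
#include <cstring>

// Sketch of the circular-buffer protocol; all names are hypothetical.
const int CHUNK = 600;               // bytes copied to the cloud variable per call
static char logBuffer[4096];         // circular log buffer
static int head = 0;                 // write position
static int tail = 0;                 // read position
static int count = 0;                // bytes currently buffered
static bool testComplete = false;    // set when the test report is finished
static char chunk[CHUNK + 1];        // contents exposed as the Spark variable

// Append one byte to the log (called by the test reporter).
void logWrite(char c) {
    logBuffer[head] = c;
    head = (head + 1) % sizeof(logBuffer);
    if (count < (int)sizeof(logBuffer)) {
        count++;
    } else {
        tail = head;                 // buffer full: overwrite oldest byte
    }
}

// The "advance" function: copies up to CHUNK bytes into the variable.
// Returns the number of bytes copied, 0 if the log is still open but
// currently empty, or -1 once the test is complete and all data consumed.
int logAdvance() {
    if (count == 0)
        return testComplete ? -1 : 0;
    int n = 0;
    while (n < CHUNK && count > 0) {
        chunk[n++] = logBuffer[tail];
        tail = (tail + 1) % sizeof(logBuffer);
        count--;
    }
    chunk[n] = '\0';
    return n;
}
```

A client would then loop: call the function, read the variable when the return is positive, retry on 0, and stop on -1.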
Although a little hacky, this allows relatively small streams (<1MB) to be pushed through the cloud relatively quickly.