Replies: 6 comments 8 replies
-
If I explicitly print the timing information, the discrepancy goes away.
So it looks like 14.9ms means something else, maybe the time spent inside this function itself? I think we have to use our own observer-like printer.
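For comparison, tracing-subscriber's fmt layer can already emit a close event per span that splits busy time from idle time, which is one way to see whether a reported number is wall time or only time spent actively executing. A minimal sketch, assuming the tracing, tracing-subscriber, and tokio crates; this is not the fastn setup:

```rust
use tracing_subscriber::fmt::format::FmtSpan;

#[tracing::instrument(skip_all)]
async fn handler() {
    // Mostly idle: the span is open for ~100ms but does almost no work.
    tokio::time::sleep(std::time::Duration::from_millis(100)).await;
}

#[tokio::main]
async fn main() {
    // Log an event whenever a span closes, including time.busy and time.idle.
    tracing_subscriber::fmt()
        .with_span_events(FmtSpan::CLOSE)
        .init();

    handler().await;
}
```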
-
My understanding about tracing was indeed incorrect. If I try this:

#[tracing::instrument(skip_all)]
async fn foo() {
    tokio::time::sleep(std::time::Duration::from_secs(3)).await;
}

#[tracing::instrument(skip_all)]
async fn test() -> fastn_core::Result<fastn_core::http::Response> {
    tokio::time::sleep(std::time::Duration::from_secs(2)).await;
    foo().await;
    Ok(actix_web::HttpResponse::Ok().finish())
}

This is the output I get.
No idea what these percentages mean.
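One way to sanity-check what those numbers refer to is to time the outer call directly and compare it against what the subscriber prints; with the sleeps above, the wall time should be roughly 5s. A simplified, self-contained sketch: the return type is dropped so it does not need fastn_core or actix_web.

```rust
#[tracing::instrument(skip_all)]
async fn foo() {
    tokio::time::sleep(std::time::Duration::from_secs(3)).await;
}

#[tracing::instrument(skip_all)]
async fn test_simplified() {
    tokio::time::sleep(std::time::Duration::from_secs(2)).await;
    foo().await;
}

#[tokio::main]
async fn main() {
    tracing_subscriber::fmt().init();

    let start = std::time::Instant::now();
    test_simplified().await;
    // Expect ~5s here: 2s in test_simplified itself plus 3s inside foo.
    println!("wall time of test_simplified: {:?}", start.elapsed());
}
```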
-
The issue was with tracing-forest. I have removed it as a dependency, borrowed code from them, and added our own spin to it: instead of tracking BODY and ROOT like they did, we track total wall time for any function, and we also print the time at which a span or event is created relative to its parent span, to show more timings. The key performance issues are due to (a) HTTP requests, for which we have to cache 404 responses, and (b) time to parse files, for which we can cache parse output.
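As a rough illustration of the 404-caching idea, something like the following could remember URLs that already returned 404 so they are not fetched again. This is a hypothetical sketch; NotFoundCache and its methods are made-up names, not fastn-core APIs.

```rust
use std::collections::HashSet;
use std::sync::Mutex;

/// Hypothetical cache of URLs that previously returned 404, so repeated
/// lookups can be answered without making the HTTP request again.
#[derive(Default)]
struct NotFoundCache {
    urls: Mutex<HashSet<String>>,
}

impl NotFoundCache {
    fn is_cached_404(&self, url: &str) -> bool {
        self.urls.lock().unwrap().contains(url)
    }

    fn record_404(&self, url: &str) {
        self.urls.lock().unwrap().insert(url.to_string());
    }
}
```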
-
I have implemented
-
For:

fn t2() {
    let ftd_src = std::fs::read_to_string("doc-site.ftd").unwrap();

    // Time the full parse of the source file.
    let start = std::time::Instant::now();
    let doc = ftd::interpreter::ParsedDocument::parse_with_line_number("yo", ftd_src.as_str(), 0)
        .unwrap();
    println!("time to parse: {:?}", start.elapsed());

    // Write the parsed document out as JSON so it can be read back below.
    std::fs::write("doc-site.json", serde_json::to_string_pretty(&doc).unwrap()).unwrap();

    // Time the cached path: hash the source, then read and deserialize the JSON.
    let start = std::time::Instant::now();
    generate_hash(ftd_src.as_str());
    serde_json::from_str::<ftd::interpreter::ParsedDocument>(
        std::fs::read_to_string("doc-site.json").unwrap().as_str(),
    )
    .unwrap();
    println!(
        "time to verify hash and read from disc and deserialize: {:?}",
        start.elapsed()
    );
}

Which gives (
So if we cache the JSON in a
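A sketch of how a hash-keyed JSON cache could work: store the source hash next to the serialized document and only reuse the JSON when the hash still matches. The use of the sha2 crate and the file-per-document layout are assumptions for illustration, not the fastn implementation.

```rust
use sha2::{Digest, Sha256};

/// Hex-encoded SHA-256 of the source text (illustrative choice of hash).
fn source_hash(src: &str) -> String {
    Sha256::digest(src.as_bytes())
        .iter()
        .map(|b| format!("{:02x}", b))
        .collect()
}

/// Return the cached JSON if the stored hash still matches the current source;
/// otherwise None, and the caller re-parses and rewrites both files.
fn read_cached_json(src: &str, hash_path: &str, json_path: &str) -> Option<String> {
    let stored = std::fs::read_to_string(hash_path).ok()?;
    if stored.trim() == source_hash(src) {
        std::fs::read_to_string(json_path).ok()
    } else {
        None
    }
}
```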
-
Baffling

So right now I am trying to optimise fastn build, and it's really baffling. I am getting output like this:

The relevant code is:

for main in documents.values() {
    // Time how long processing this one document takes.
    let start = std::time::Instant::now();
    print!(
        "Processing {}/{} ... ",
        config.package.name.as_str(),
        main.get_id()
    );
    if let Ok(()) = handle_file(main, config, file, base_url, ignore_failed, test).await {
        fastn_core::utils::print_end(
            format!(
                "Processed {}/{}",
                config.package.name.as_str(),
                main.get_id()
            )
            .as_str(),
            start,
        );
    }
}

The reported time elapsed, as measured by
It's like my for loop is sleeping for almost 99.9% of the time: it wakes up, executes for ~1ms, and then sleeps for ~1s! This is really baffling!
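One hypothetical way to confirm where the missing time goes is to also measure the gap between iterations: if the body consistently takes ~1ms while the gap is ~1s, the time is being spent outside this loop. A stand-alone diagnostic sketch, where process() stands in for handle_file():

```rust
async fn process(_item: &str) {
    // Stand-in for handle_file(...): pretend the body itself is fast.
    tokio::time::sleep(std::time::Duration::from_millis(1)).await;
}

#[tokio::main]
async fn main() {
    let items = vec!["a", "b", "c"]; // stand-in for documents.values()
    let mut previous_end = std::time::Instant::now();
    for item in items {
        // Time spent since the previous iteration finished (outside the body).
        let gap = previous_end.elapsed();
        let start = std::time::Instant::now();
        process(item).await;
        println!("{item}: body {:?}, gap before it {:?}", start.elapsed(), gap);
        previous_end = std::time::Instant::now();
    }
}
```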
-
Status: In progress
Owner: @amitu
As part of our 0.4 release we want to improve our performance story. The js based runtime and the fastn 0.4 versioning proposals are other attempts to improve performance.
What I am struggling with right now:
If I run the server with tracing enabled, and make a single HTTP request using curl (time curl http://127.0.0.1:8000), I get the following log:

time also reports:

As you see, our route function (defined in fastn/fastn-core/src/commands/serve.rs, line 637 in 14bf79b) takes 700ms (still too much, but I have to say "only"), but the end to end time taken is 3.2s.

Where is the remaining (3.2s - 0.7s =) 2.5s going? If you see this: fastn/fastn-core/src/commands/serve.rs, line 755 in 14bf79b, route is called by actix as the entry point for all requests. So how is the actix middleware reporting 3.2s but the entry point only taking 0.7s (this number comes from the tracing macro)? Is this an actix issue? How do I go about debugging it?

So far I am trying to debug this by removing code. If I put a hello world in route(), the middleware and tracing agree on how long it takes. So something inside route() has to be the culprit.

Any thoughts @Arpita-Jaiswal, @abhiramiyer, @Heulitig, @harshdoesdev?
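A hypothetical way to narrow this down is a minimal standalone actix-web server whose handler times its own body; if time curl still reports much more than the handler prints, the gap is outside the handler. This is a sketch under that assumption, not the fastn route; it only uses actix-web, tracing, and tracing-subscriber:

```rust
use actix_web::{web, App, HttpResponse, HttpServer};

#[tracing::instrument(skip_all)]
async fn route() -> HttpResponse {
    let start = std::time::Instant::now();
    // ... real work would go here ...
    let response = HttpResponse::Ok().body("hello");
    // Compare this number against `time curl` and the tracing output.
    println!("inside route(): {:?}", start.elapsed());
    response
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    tracing_subscriber::fmt().init();
    HttpServer::new(|| App::new().route("/", web::get().to(route)))
        .bind(("127.0.0.1", 8000))?
        .run()
        .await
}
```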