spark

spark is a performance profiler for Minecraft clients, servers and proxies.

Hello!

A number of miscellaneous fixes this time, including, importantly, a fix for running spark on Java 22+ JVMs:
  • Include server brand in viewer payloads
  • Ignore trailing sleeping samples for duration estimation
  • Use WeakReference for command senders
  • Refactor ClassFinder into interface
  • Add JVM information to viewer metadata
  • Redact username from vm args
  • Add placeholder resolver API
  • Include gamerule settings in metadata
  • Fix gamerule read on older Bukkit versions
  • Check for async command completion before sleeping
  • Cleanup spark tmp dir on startup
  • Add uploadable health report
  • Refactor sampler node export process
  • Include engine type in sampler proto
  • Include datapacks info in sampler proto
  • Include memory stats in ws updates
  • Merge service files to fix unrelocated adventure module name leaking
  • Improve error logging in various places
  • Support --ignore-sleeping with async-profiler
  • Upgrade async-profiler to v3
Enjoy!
----------, Dec 1, 2024

Hello!

One new feature this time and a number of other misc fixes :)

New feature: Info Points!

  • "Info Points" are little snippets of information that will display alongside well-known/notable call frames in the spark viewer when you hover over the ⓘ symbol.
  • The idea is that these should/could help less experienced or technical users understand what different parts of the call stack relate to in real terms (especially Minecraft server/client internals).
  • The descriptions are open source and can be easily edited/changed/improved by the community. If you notice anything that looks incorrect or think there is a thread/method call that deserves an extra description, please consider contributing it!
More info about the feature: https://spark.lucko.me/docs/Using-the-viewer#info-points

More info about contributing new info points:

And some other fixes included in this release since the last one:
  • Fix NPE in ClassFinder
  • Don't use multi-release jar for websocket code
  • Close worker thread pool on profiler stop
  • Tidy up thread factories and async sampler regex thread filter
  • Allow platforms to provide their own TPS/MSPT calculations
  • Fix only-ticks-over rejected execution error
  • Fix typo in Configuration.getInteger
  • Include pufferfish config in viewer
  • Add support for Paper Plugins
  • Fix parsing null async profiler segments
  • Use a different websocket library for live viewers
  • Clarify command timeout warning msg
  • Don't try to send statistics to a closed socket
  • Add rcon ip to hidden config paths
  • Bump adventure-platform to 4.3.3
  • Add note to /spark profiler open about expiry time
  • Bump common dependency versions

Enjoy :)
----------, Jun 9, 2024

Hello! Two new features for you this time!

Live Viewer Updates

  • spark can now continuously stream profiling data to the viewer while the profiler is running.
  • To use this new feature, just run /spark profiler open after starting a profiler. It also works for the background profiler :)
  • The profiling data will refresh every minute, and the statistics (shown in the widgets at the top of the page) refresh every 10 seconds.
  • The 'rings' in the top right corner of the viewer indicate when the next data update is expected.

(Memory) Allocation Profiling

In summary: as well as profiling how long each method takes to execute (which is what the existing profiler does), spark can now also record how much memory each method allocates.
  • This is based on the allocation profiling feature in async-profiler, so you can read more about the technical details here: https://github.com/jvm-profiling-tools/async-profiler#allocation-profiling
  • Like the CPU profiler, it uses sampling rather than instrumentation, so the performance impact should be minimal.
  • The spark viewer UI has been updated to support the changes, including showing bytes instead of time when appropriate.
  • The output can be shown as a percentage of the total (like the existing profiler) or as bytes allocated per second.
  • The command is /spark profiler start --alloc
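
To make the two display modes concrete, here is a small sketch of how sampled allocation events might be aggregated into a percentage-of-total view and a bytes-per-second view. The method names and numbers are invented, and this is not spark's actual implementation:

```python
from collections import defaultdict

def summarise_allocations(samples, duration_seconds):
    """Aggregate sampled (method, bytes) allocation events into
    percentage-of-total and bytes-per-second views."""
    totals = defaultdict(int)
    for method, size in samples:
        totals[method] += size
    grand_total = sum(totals.values())
    return {
        method: {
            "percent": 100.0 * size / grand_total,
            "bytes_per_sec": size / duration_seconds,
        }
        for method, size in totals.items()
    }

# Hypothetical samples: (method name, bytes allocated per sampled event)
samples = [
    ("Chunk.load", 4096), ("Chunk.load", 4096),
    ("Entity.tick", 2048),
]
stats = summarise_allocations(samples, duration_seconds=2.0)
```

With the hypothetical samples above, Chunk.load accounts for 80% of recorded allocations, and Entity.tick works out to 1024 bytes/sec over the 2-second window.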
There are also a few other misc changes in this release. You can see the full changelog here: https://spark.lucko.me/changelog

Cheers :)
----------, Feb 5, 2023

Hello!

The main feature introduced in this update is: Background/Continuous Profiling!
  • spark will now automagically start a lightweight profiler in the background when it enables.
  • In simple terms, this means that at any time, you can run /spark profiler upload and get the last 1 hour of profiling data to look at!
  • By default, the background sampling interval is 10ms which should be low enough to be unnoticeable on most hardware!
  • It's configurable - you can turn the feature off or configure the interval in spark's config file.
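
For reference, a sketch of what the relevant section of spark's config file might look like. The key names shown here are assumptions on my part, not confirmed; check the spark documentation for the exact names:

```json
{
  "backgroundProfiler": true,
  "backgroundProfilerInterval": 10
}
```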
More detailed info can be found on GitHub: https://github.com/lucko/spark/pull/265

The feature has been in beta for a little while, and I think most of the issues have now been ironed out, but please let me know if you run into any problems!


Plus some bonus fixes/changes:
  • Tidy up command feedback messages
  • Remove recursion in protobuf data to avoid StackOverflowException
  • Limit profile length to 1 hour
  • Suppress profiler logs when running in background
  • Remove recursive calls in class source visitor
  • Fix some NullPointerExceptions
  • Include player/entity/chunk counts in window statistics
  • Upgrade to async-profiler 2.9
  • Add a temporary workaround for async-profiler JVM crash issues
----------, Dec 10, 2022

Hello!

The main new feature in this release is: the "Refine" Graph!
  • The graph is shown at the top of the viewer, and allows for the profiler output to be "refined" to a certain period of time.
  • Plotted on the graph are key performance metrics, meaning users can drill into specific parts of a profile that might be causing problems.
  • If you have used aikar's timings v2 before you might be familiar with this feature, as it fulfils the same purpose as the timings graph.
  • There are some demo videos you can watch here: https://github.com/lucko/spark/pull/253
This release also includes the following other changes:
  • Support linux x64 musl (commonly used in Docker images)
  • Additional server config exclusions
  • Show the CPU model name for Windows
  • Update async-profiler to fix some issues/bugs
Enjoy :)
----------, Oct 15, 2022

Hello! Two new changes for you this time. :)

Firstly, the spark viewer now includes information about Worlds (loaded entities and chunks)!


And secondly, there is now an option to show average time taken per tick instead of time taken as a percentage (previous behaviour).

Remember that each tick should take <=50ms to maintain 20 TPS, so if the Server thread (or any nested frame, for that matter) is exceeding that value, the impact on server performance will be noticeable to players!
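
The arithmetic behind that 50ms budget, as a quick standalone snippet (purely illustrative, not part of spark):

```python
TICKS_PER_SECOND = 20

# A server must finish each tick within 1000ms / 20 = 50ms to hold 20 TPS
tick_budget_ms = 1000 / TICKS_PER_SECOND

def effective_tps(avg_mspt):
    """Once average tick duration exceeds the budget, ticks back up and
    the achievable tick rate drops below 20."""
    return min(TICKS_PER_SECOND, 1000 / avg_mspt)
```

So a server averaging 100ms per tick can only manage 10 TPS.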

And some other small things:
  • Better handling of Paper's new split config system
  • Full support for text components in 1.19 (clickable/hoverable messages were broken)
  • Filtered some additional sensitive config settings from the profile upload
Enjoy! :)
----------, Jun 28, 2022

Hello! Here is the latest round of spark changes:
  • spark can now show/calculate network usage statistics on Linux systems (included in the output of /spark health)
  • Some additional metrics are now included in the viewer: how long the profiler was running for, and the average player count
  • The memory usage statistic is now more accurate on Linux systems
  • Some extra fields are excluded from the attached "server configs" section in the viewer: specifically the world/feature seeds
  • The async-profiler engine has been upgraded to the latest version
  • Some bugs have (hopefully) been resolved that were sometimes causing issues when stopping the profiler
Enjoy :)
----------, May 22, 2022

Hello!

Another round of changes for you all :) In no particular order...

Profile Flat View
The spark viewer now has a new mode - flat view.


This view shows a flattened representation of the profile, where the slowest 250 method calls are listed at the top level. It can be sorted according to total time or self time, and displayed either top down (like the normal view) or bottom up (where expanding a node reveals the methods that called it).
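
To clarify the two sort options: "total time" includes everything a method calls, while "self time" is only the time spent in the method's own frame. A minimal sketch with a hypothetical call tree (not spark's actual data model):

```python
def self_time(node):
    """Self time = a node's total time minus time attributed to callees."""
    return node["total"] - sum(child["total"] for child in node["children"])

# Hypothetical tree: Server.tick spends 30ms itself and 70ms in callees
tick = {
    "name": "Server.tick",
    "total": 100,
    "children": [
        {"name": "World.tick", "total": 60, "children": []},
        {"name": "Network.flush", "total": 10, "children": []},
    ],
}
```

Sorting this tree by total time ranks Server.tick first (100ms); sorting by self time ranks World.tick first (60ms vs. Server.tick's 30ms).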

Server configuration settings now shown in the viewer info section
Like the rest of the data in the info section, knowing the server configuration settings (especially the ones that relate to performance optimizations) will hopefully make it easier to understand the context of a given profile.


Sensitive settings (e.g. database passwords or forwarding secrets) are excluded - like they are in Timings reports.

Thanks to omega24, who contributed some of the functionality for this change via a pull request.

New Command: /spark ping
Hopefully pretty self-explanatory! This command monitors the ping/latency of online players and stores a rolling average. You can also use /spark ping --player <name> to check the ping of a specific player.

Enjoy, see you next time... :)
----------, Jan 30, 2022

Hello!

This update features two major changes, firstly:

The profile viewer now has "widgets" that show extra contextual information about the server

(don't worry you can expand/hide it!)

This is only supported once you've updated to this version of the spark plugin though!

I hope that the extra information will make it easier to see an overview of these "server vitals" statistics, and make it possible for others to look at profiles taking into account the context the server was running in. I plan to continue adding more of this sort of stuff to spark, so more updates to follow soon with (hopefully) similarly useful things!

And secondly..

A fix has been implemented for supporting the "async-profiler mode" inside Docker/Pterodactyl containers

Previously, some extra steps were needed that unfortunately most users weren't able to perform (due to lack of access and/or the complexity of the changes required). This meant that spark was falling back to the built-in Java engine, when in fact it could have still used the async-profiler engine, just without support for profiling native calls (which isn't useful for 99.99% of people anyway!).

In summary..

I encourage everyone to update! :)

If you have suggestions for how spark can be made even better, I'd love to hear them! Or if you can write code, submit a PR! spark is a cool project to work on if you fancy contributing something. Feel free to get in touch on Discord if you want to chat.

Finally, all that's left is to wish you an (early) Happy New Year, and I hope you've had an enjoyable Christmas if you celebrated it. :)
----------, Dec 29, 2021

Hello!

This release adds specific support for Minecraft 1.18 (clickable messages weren't working and required a dependency update to fix).

The format of the data sent to the web viewer was also tweaked slightly to allow for improvements to be made in the future.

Enjoy. :)
----------, Dec 2, 2021

Hi all, this update contains some minor fixes and improvements:

  • Support the --timeout argument for async-profiler sampling mode
  • Update adventure to a newer version
  • Added an (optional) basic configuration file, initially allowing configuration of the bytebin/viewer URLs
  • Catch UnsatisfiedLinkError which sometimes occurred when loading the async-profiler native lib
  • Add a command flag to save profile/heapsummary to a local file instead of uploading to the viewer
  • Some other misc tidying up & light refactoring
Cheers,
~L
----------, Oct 8, 2021

Hi all, I hope you are well. I have a new update for you!

Per-plugin profiler view
This version implements a new feature: "sources" view - which allows the viewer to display profiling output broken down by plugin. I hope that this will make reading/interpreting profiles (especially when the aim is to find laggy plugins) much easier. :)


Let me know what you think! I'm open to suggestions (or PRs!!) for how this functionality could be improved further.

Until next time... :)
----------, Jun 1, 2021

Hello everyone, update time!

If you like spark, please consider leaving a review or giving it a star on GitHub!

On with the changes...

Added support for async-profiler
  • async-profiler has been integrated into spark as a new profiler engine.
  • It is currently supported on Linux x86_64 servers only; the existing Java (WarmRoast) profiler will continue to be maintained for other systems and modes (like --only-ticks-over).
  • It's much more accurate and has a lower profiling overhead than the existing engine - win win!

Added permissions for sub-commands instead of just requiring 'spark'
  • Sorry it took so long... I of all people should know better!

Website/viewer changes and improvements
  • Deployed a new documentation site
  • Lots of style changes/improvements, added a new(ish) logo
  • Deobfuscation mappings are now applied automatically
  • Re-added the search bar - finally!
  • Re-added highlight/bookmarks, these are now encoded in the URL so you can share specific points in a profile with others easily

Fixed some bugs
  • The main one was "fix a bug upon early server startup in which percentiles would throw an out of bounds exception" - thanks to astei for that fix!

That's all I got for ya, until next time... :)
----------, Mar 13, 2021

Hey everyone

The update includes the following changes to the spark plugin from the past few months:
  • Allow exact tick duration to be used as threshold in tickmonitoring command
  • Add '/spark gc' and '/spark gcmonitor' commands to help track GC activity
  • Improve the --ignore-sleeping flag, add --ignore-native flag
  • Add 95th percentile MSPT and replace avg MSPT with median
  • Include platform info in sampler and heap summary data

More significantly, you will notice that in the last 24hrs the web viewer has received a big upgrade. It's now significantly more responsive and loads in a fraction of the time it took previously. Many thanks to @Tux for their help implementing these improvements.

Enjoy :)
----------, Dec 14, 2020

Hello - some updates for you.

A number of improvements to the profiler viewer
  • Theme changes, should be much easier to read now!
  • Added right-click to "bookmark" a method call in the stack
  • Added title to show who created, time of creation, etc information
  • Improved the deobfuscation remapping algorithm
  • Uploaded profiles will now expire after 1 month instead of 1 week (pending - I'll change the config on this later today)

Include method descriptions in the data sent to the viewer to enable better application of deobfuscation mappings.
  • With this change, the spark viewer is able to provide deobfuscation mappings for pretty much all unmapped methods!

Treat different methods (not just methods with the same name) as different stack nodes
  • This is mostly an artifact of the change above, but is still a noteworthy improvement.
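
As a rough illustration of why this matters: with method descriptors available, stack nodes can be keyed on (name, descriptor), so two overloads with the same name no longer merge into a single node. A sketch with invented names (not spark's internals):

```python
from collections import defaultdict

# Key nodes on (method name, descriptor) so overloads stay separate
nodes = defaultdict(int)

def record(name, descriptor, time_ms):
    nodes[(name, descriptor)] += time_ms

# Two hypothetical overloads of the same method name
record("setBlock", "(Lnet/minecraft/core/BlockPos;)V", 5)
record("setBlock", "(III)V", 3)
```

Keying on name alone would have merged both calls into one 8ms node; with descriptors they remain two distinct nodes.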

Allow comments to be specified on profiler output
  • This comment will show up at the top of the viewer.
  • Should make it easier to organise lots of profiler tabs!
  • e.g. /spark sampler --stop --comment my survival server lag


That's it for now. Enjoy!
----------, Apr 2, 2020

  • Added --order-by-time option to sampler
  • Implement alternative means of tick counting for Paper servers
  • Monitor average tick durations (where possible) and report in /spark tps and /tps
  • Other misc improvements & cleanup
----------, Feb 6, 2020

  • Make heap dump parsing more resilient to bad formatting
  • Send tick monitoring messages async
  • Add support for heap dump compression
  • Use protobuf to encode data sent to the web viewer instead of JSON
  • Add option to ignore "sleeping" threads when profiling
  • Implement pagination in /spark activity
  • Fix various issues with --regex flag matching
  • Add support for PlaceholderAPI and MVdWPlaceholderAPI
  • Make CPU monitoring thread a daemon thread
----------, Nov 26, 2019

  • Add a /spark tps command, for more accurate monitoring of the server's tick rate
  • Add /spark healthreport command, to view general information about the server's status
  • Add "activity log" feature for keeping track of past samples/memory dumps
  • Add support for generating heap dumps on non-hotspot JVMs (OpenJ9)
  • Add "combine-all" thread grouping argument
  • Allow sampling at fractions of a millisecond intervals
  • Improve the performance/efficiency of the sampler
  • Ensure that the plugin cleans up completely when disabled
----------, May 14, 2019

  • Add a command (/spark heapdump) for generating hprof memory snapshots
  • Allow thread names to be specified using regex
  • Use fragment identifier instead of query parameters for web viewer (fixes issue with Multicraft consoles)
  • Add --without-gc flag to disable GC notifications during monitoring
  • Add --include-line-numbers flag to record the line number of method calls during sampling
  • Improve the type descriptor conversion in heap dump outputs
  • Update okhttp library version
  • Count ticks using a normal Java int instead of a LongAdder
  • Improve ThreadGrouper "by pool" regex expression
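
For context on what grouping "by pool" does: worker threads usually carry a numeric suffix (pool-4-thread-2, Craft Scheduler Thread - 7, and so on), and a regex can strip that suffix so all workers in a pool collapse into one group. The pattern below is an illustration, not spark's actual expression:

```python
import re

# Strip a trailing separator + number so pool workers share one group name
POOL_PATTERN = re.compile(r"^(.*?)[-# ]+\d+$")

def group_name(thread_name):
    match = POOL_PATTERN.match(thread_name)
    return match.group(1) if match else thread_name
```

With this sketch, pool-4-thread-2 and pool-4-thread-7 both map to the group "pool-4-thread", while a name with no trailing number, like "Server thread", is left untouched.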
----------, Jan 1, 2019

  • Improved sampler efficiency
  • Implement GC notifications as part of the monitoring command
  • Add /spark heap for basic heap dump (memory) analysis
  • Implement tab completion for commands
----------, Oct 19, 2018

  • Added a /spark alias
  • Added a max stack depth limit to fix issues with rendering
  • Allow multiple threads to be specified in the same command. e.g. /profiler start --thread Thread1 --thread Thread2
  • Optimize the way data is collected and processed
  • Add an option to group all thread pool processes under the same node in the viewer. This is enabled by default, use --not-combined to disable it.
  • Add '/profiler monitoring' command to monitor tick times, and '--only-ticks-over' argument to filter profiling output to ticks lasting over a certain duration
  • Improved the way data is serialized
  • Changed the default sampling interval from 10ms to 4ms
----------, Jun 5, 2018

Resource Information
Author:
----------
Total Downloads: 198,359
First Release: May 28, 2018
Last Update: Dec 1, 2024
Category: ---------------
All-Time Rating:
103 ratings
Find more info at spark.lucko.me...