What is a good duration between two garbage collections on production JVMs?
I am trying to figure out what a reasonable duration between two garbage collections is on a Java 8 JVM in production.
I can tune the memory available to the JVM, and as a side effect increase the duration between two garbage collections. How do I distinguish between a normal situation and one where the machine simply does not have enough memory allocated?
This problem applies to systems such as Jira and Confluence, as you can see in the screenshot: a garbage collection takes place every 3 hours.
/usr/lib/jvm/java-8-oracle/bin/java -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager -Xms15000m -Xmx15000m -XX:+PrintGCDateStamps -XX:-OmitStackTraceInFastThrow ... org.apache.catalina.startup.Bootstrap start
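For what it's worth, if I wanted the GC log to show how much heap each collection actually frees, my understanding is that the Java 8 HotSpot flags would look roughly like the following (the log path and rotation sizes are just examples, not what we run today):

    -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintGCTimeStamps
    -Xloggc:/path/to/gc.log
    -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 -XX:GCLogFileSize=20M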
GCs running at all is not an indicator of anything by itself. Minor GCs can run several times per second on some workloads without signifying anything bad, and concurrent (CMS) / mixed (G1) GC phases running once a minute or so can be normal on other workloads.
Two better measures are the following:
- How much heap an old gen GC frees. If utilization barely goes down at all, you either have an extremely well-tuned application or you're approaching an OOME, either due to insufficient heap space or a memory leak in the application.
- How many CPU cycles are spent in GC vs. CPU cycles spent in application time. This more or less follows from the first point: if a GC can only collect a little garbage, it has to run again soon because garbage keeps piling up. (A sketch for measuring both of these follows this list.)
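Here is a minimal sketch of checking both measures from inside the JVM (or anywhere you can reach its platform MXBeans over JMX), using only the standard java.lang.management API. The "Old" pool-name match assumes HotSpot naming such as "PS Old Gen" or "G1 Old Gen", and the time ratio is GC wall-clock time over uptime, a rough proxy for CPU spent in GC rather than an exact cycle count:

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryPoolMXBean;
    import java.lang.management.MemoryUsage;

    public class GcHealthCheck {
        public static void main(String[] args) throws InterruptedException {
            while (true) {
                // Measure 1: old-gen occupancy as sampled right after the last collection
                // of that pool (HotSpot pool names are e.g. "PS Old Gen", "G1 Old Gen").
                for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                    MemoryUsage afterGc = pool.getCollectionUsage();
                    if (afterGc != null && afterGc.getMax() > 0 && pool.getName().contains("Old")) {
                        System.out.printf("%s after last GC: %d of %d MB used%n",
                                pool.getName(),
                                afterGc.getUsed() / (1024 * 1024),
                                afterGc.getMax() / (1024 * 1024));
                    }
                }
                // Measure 2: cumulative GC time vs JVM uptime as a rough "time spent in GC" ratio.
                long gcMillis = 0;
                for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                    if (gc.getCollectionTime() > 0) {
                        gcMillis += gc.getCollectionTime();
                    }
                }
                long uptimeMillis = ManagementFactory.getRuntimeMXBean().getUptime();
                System.out.printf("GC time so far: %d ms of %d ms uptime (%.2f%%)%n",
                        gcMillis, uptimeMillis, 100.0 * gcMillis / uptimeMillis);
                Thread.sleep(60_000); // sample once a minute
            }
        }
    }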
This is what triggers the "GC overhead limit exceeded" OOME in the first place: the JVM keeps track of how much time it spends GCing and throws the exception when that time exceeds a default limit.
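If I recall the HotSpot defaults correctly, that limit is roughly "more than 98% of the time spent in GC while recovering less than 2% of the heap", and it can be tuned or switched off with flags along these lines (verify the exact names and defaults against the HotSpot documentation for your collector):

    -XX:-UseGCOverheadLimit    disables the overhead check entirely
    -XX:GCTimeLimit=98         percentage of total time spent in GC before the OOME is thrown
    -XX:GCHeapFreeLimit=2      minimum percentage of heap a collection must free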
There are exceptions to point two, for example single-threaded applications that use a concurrent collector and burn CPU cycles GCing on another core to achieve higher throughput.
For a web application you can't judge things based on GC intervals alone, because they vary with user demand and with in-memory caches/data stores/databases being warmed up as users start utilizing the system each day. It also depends on how those caches are freed/scrubbed of stale entries.
You will have to read the documentation of the application to find out what kind of in-memory stores it uses and how they can be tuned / limited.
Running a realistic load test that emulates multiple users and crawls many pages of the application might give you a baseline for how much memory it will need.
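While such a load test runs, even something as simple as jstat attached to the JVM (the PID is a placeholder) shows old-gen occupancy and cumulative GC time at a glance, here sampled every 10 seconds:

    jstat -gcutil <pid> 10000

If the O column (old gen occupancy, in percent) stays close to 100 even right after full GCs, the heap is either too small for the workload or something is leaking.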