lpnet app servers are leaking memory again
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
Launchpad itself | Fix Released | High | Gary Poster |
Bug Description
A non-exhaustive analysis suggests they're gaining around 200-250 MB per day.
But we also see "events" that cause sudden memory gains, e.g.:
USER PID PPID NI PRI TIME %MEM RSS SZ VSZ STAT BLOCKED NLWP STARTED ELAPSED CMD
2010-02-
1000 20262 1 0 19 19:41:12 11.3 692776 794852 1049884 Sl 0000000000000000 6 Tue Feb 23 13:33:55 2010 2-00:21:28 /usr/bin/python2.5 bin/run -i lpnet12
2010-02-
1000 20262 1 0 19 19:41:41 13.3 813460 925924 1180956 Sl 0000000000000000 6 Tue Feb 23 13:33:55 2010 2-00:22:07 /usr/bin/python2.5 bin/run -i lpnet12
i.e., a view of lpnet12 on wampee, one snapshot @ 2010-02-
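For scale, the two snapshots above can be turned into a growth figure directly. A quick sketch (the numbers are the RSS values from the ps output, which are in kB):

```python
# Arithmetic check of the jump between the two ps snapshots above.
# ps reports RSS in kB, so the gain in MB is the kB delta divided by 1024.
def rss_gain_mb(rss_before_kb, rss_after_kb):
    """Return the RSS growth between two snapshots, in MB."""
    return (rss_after_kb - rss_before_kb) / 1024.0

gain = rss_gain_mb(692776, 813460)  # the two RSS columns above
# roughly 118 MB gained in the ~39 s between snapshots
```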
The end result is that the processes eventually head into swap land and need restarting.
Related branches
- Данило Шеган (community): Approve
- Diff: 33 lines (+5/-4), 1 file modified: lib/lp/translations/browser/translationmessage.py (+5/-4)
Changed in launchpad-foundations:
importance: Undecided → High
status: New → Triaged
milestone: none → 10.03
assignee: nobody → Stuart Bishop (stub)

Changed in launchpad-foundations:
assignee: Stuart Bishop (stub) → Guilherme Salgado (salgado)

Changed in launchpad-foundations:
assignee: Guilherme Salgado (salgado) → nobody

Changed in launchpad-foundations:
milestone: 10.03 → 10.04

Changed in launchpad-foundations:
milestone: 10.04 → 10.05

Changed in launchpad-foundations:
status: Triaged → Fix Committed
assignee: nobody → Gary Poster (gary)
tags: added: qa-untestable; removed: qa-needstesting
tags: added: canonical-losa-lp

Changed in launchpad-foundations:
status: Fix Committed → Fix Released
I checked the logs (https://devpad.canonical.com/~salgado/leak.txt) from the period when we saw the spike in memory usage and created this spreadsheet to make them easy to sort: https://devpad.canonical.com/~salgado/leak.ods
We were hoping to find some suspicious-looking requests there, and there are a few indeed, but when I tried hitting (a freshly restarted) staging with them, we didn't see any increase in memory usage: https://pastebin.canonical.com/28709/
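The replay step can be sketched roughly as follows. This is hypothetical: the log format assumed here (one "METHOD /path" per line) and the host argument are illustrative, not leak.txt's actual format.

```python
# Hypothetical sketch of replaying suspicious requests against staging.
# The "METHOD /path" log format is an assumption for illustration.
import urllib.request

def parse_requests(log_text):
    """Pull (method, path) pairs out of lines like 'GET /some/page'."""
    reqs = []
    for line in log_text.splitlines():
        parts = line.split()
        if len(parts) >= 2 and parts[0] in ("GET", "POST"):
            reqs.append((parts[0], parts[1]))
    return reqs

def replay(host, requests):
    """Fetch each path once against `host`, returning the HTTP statuses."""
    statuses = []
    for method, path in requests:
        req = urllib.request.Request("https://%s%s" % (host, path), method=method)
        with urllib.request.urlopen(req, timeout=30) as resp:
            statuses.append(resp.status)
    return statuses
```

Watching the server's RSS before and after each batch of replayed requests would tie a memory jump to a specific request, if one is responsible.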