commit fails for big files
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
Bazaar | New | Undecided | Unassigned |
Bug Description
When I commit large files, Bazaar buffers the data in system memory for performance reasons. If this exceeds 2 GB of memory, the commit fails.
For standard source-code applications this should work fine, but for bigger files (even videos) there should be some streaming feature...
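The underlying issue can be illustrated with a minimal sketch (this is not bzrlib's actual code; the function names, chunk size, and use of SHA-1 are illustrative assumptions): reading a whole file with `f.read()` requires memory proportional to the file size, while processing it in fixed-size chunks keeps peak memory bounded regardless of how large the file is.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB per read; keeps memory bounded


def hash_buffered(path):
    # Reads the entire file into memory at once -- this is what
    # triggers MemoryError for multi-GB files on 32-bit Python.
    with open(path, "rb") as f:
        return hashlib.sha1(f.read()).hexdigest()


def hash_streamed(path):
    # Reads fixed-size chunks, so peak memory stays around
    # CHUNK_SIZE no matter how big the file is.
    sha = hashlib.sha1()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            sha.update(chunk)
    return sha.hexdigest()
```

Both functions produce the same digest; only their peak memory use differs, which is why a streaming path would let commits of multi-gigabyte files succeed.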
```
Committing to: E:/temp/bazaar/
added robots
added robots/
added robots/
aborting commit write group: MemoryError()
bzr: ERROR: exceptions.
Traceback (most recent call last):
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "C:/Programme/
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\knit.pyo", line 913, in add_lines
  File "bzrlib\knit.pyo", line 923, in _add
MemoryError

bzr 1.15 on python 2.5.2 (win32)
arguments: ['filenames removed']
encoding: 'cp1252', fsenc: 'mbcs', lang: None
plugins:
  bzrtools C:\Programme\
  launchpad C:\Programme\
  netrc_
  qbzr C:\Programme\
  svn C:\Programme\

*** Bazaar has encountered an internal error.
Please report a bug at https:/
including this traceback, and a description of what you
were doing when the error occurred.
```