httpd:

implement a proxy inside httpd so that the client can pass cookies to us
and we can then do a more faithful (i.e. transparent) HTTP view

perhaps have the proxy do some viewing too, like the "proxy" tool but
with better presentation, e.g. lists of links to all the sessions
css:

parse css for urls, etc

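a minimal C sketch of the url scan, ignoring comments and escape
sequences for now (css_urls is a hypothetical name, not an existing
function here):

    #include <stdio.h>
    #include <string.h>
    #include <ctype.h>

    /* scan a stylesheet buffer and print every url(...) reference */
    static void css_urls(const char *css)
    {
        const char *p = css;

        while ((p = strstr(p, "url(")) != NULL) {
            const char *start = p + 4, *end;
            char quote = 0;

            while (isspace((unsigned char)*start))
                start++;
            if (*start == '"' || *start == '\'')
                quote = *start++;
            end = start;
            while (*end && *end != ')' && (!quote || *end != quote))
                end++;
            printf("url: %.*s\n", (int)(end - start), start);
            p = end;
        }
    }

    int main(void)
    {
        css_urls("body { background: url(\"bg.gif\") }\n"
                 "@import url(extra.css);\n");
        return 0;
    }
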
http:

support accept-encoding

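supporting it means advertising e.g. "Accept-Encoding: gzip" in the
request and inflating the body when the response says
"Content-Encoding: gzip"; a sketch of the decode side, assuming we are
willing to link against zlib (1.2 or later, for its gzip framing
support):

    #include <string.h>
    #include <zlib.h>

    /* inflate a gzip response body in one shot; out must be large
       enough for the whole decoded body.  16 + MAX_WBITS tells zlib
       to expect gzip framing.  returns decoded length or -1. */
    long gunzip_buf(const unsigned char *in, unsigned in_len,
                    unsigned char *out, unsigned out_len)
    {
        z_stream zs;
        int rc;

        memset(&zs, 0, sizeof zs);
        if (inflateInit2(&zs, 16 + MAX_WBITS) != Z_OK)
            return -1;
        zs.next_in = (unsigned char *)in;
        zs.avail_in = in_len;
        zs.next_out = out;
        zs.avail_out = out_len;
        rc = inflate(&zs, Z_FINISH);
        inflateEnd(&zs);
        return rc == Z_STREAM_END ? (long)(out_len - zs.avail_out) : -1;
    }
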
https:

implement it
view:

use style sheets instead of html tags and attributes for color, etc

option to view glyphs instead of char codes?
html:

meta refresh url

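a parsing sketch for the content value of
<meta http-equiv="refresh" content="5; url=...">; whitespace and case
vary a lot in real pages, so this is deliberately loose:

    #include <stdio.h>
    #include <string.h>
    #include <strings.h>            /* strncasecmp */
    #include <ctype.h>

    /* return a pointer to the url inside "5; url=http://...", or
       NULL for a plain "5" (which just reloads the same page) */
    static const char *refresh_url(const char *content)
    {
        const char *p = strchr(content, ';');

        if (p == NULL)
            return NULL;
        p++;
        while (isspace((unsigned char)*p))
            p++;
        if (strncasecmp(p, "url", 3) != 0)
            return NULL;
        p += 3;
        while (isspace((unsigned char)*p))
            p++;
        if (*p == '=')
            p++;
        while (isspace((unsigned char)*p))
            p++;
        return p;
    }

    int main(void)
    {
        printf("%s\n", refresh_url("5; URL=http://example.com/"));
        return 0;
    }
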
detect ucs-2, ucs-4

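byte order mark sniffing covers the easy cases; a sketch (test order
matters, since the ucs-4 little-endian bom starts with the ucs-2 one):

    #include <stdio.h>
    #include <stddef.h>

    /* classify a document from its first bytes via the bom;
       bom-less detection (e.g. 00 3C byte patterns around '<')
       would be a follow-up */
    static const char *sniff_encoding(const unsigned char *b, size_t n)
    {
        if (n >= 4 && b[0] == 0x00 && b[1] == 0x00 &&
                      b[2] == 0xFE && b[3] == 0xFF)
            return "ucs-4 big-endian";
        if (n >= 4 && b[0] == 0xFF && b[1] == 0xFE &&
                      b[2] == 0x00 && b[3] == 0x00)
            return "ucs-4 little-endian";
        if (n >= 2 && b[0] == 0xFE && b[1] == 0xFF)
            return "ucs-2 big-endian";
        if (n >= 2 && b[0] == 0xFF && b[1] == 0xFE)
            return "ucs-2 little-endian";
        return "unknown, assume 8-bit";
    }

    int main(void)
    {
        unsigned char doc[] = { 0xFE, 0xFF, 0x00, 0x3C };

        puts(sniff_encoding(doc, sizeof doc));
        return 0;
    }
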
more checking in iso 2022 code
uri:

support i18n
mime:

deal with content type "text/html " (note the trailing space)

deal with multiple charset parameters in one content-type

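a sketch for the parameter side: report every charset=... so the
caller can pick a policy for duplicates (keeping the first one seems
safest); the stray trailing space just needs trimming before the type
token is compared:

    #include <stdio.h>
    #include <string.h>
    #include <strings.h>            /* strncasecmp */
    #include <ctype.h>

    /* walk the parameters of a content-type value and print every
       charset=...; what to do with duplicates is the caller's call */
    static void scan_charsets(const char *ct)
    {
        const char *p = strchr(ct, ';');

        while (p != NULL) {
            p++;
            while (isspace((unsigned char)*p))
                p++;
            if (strncasecmp(p, "charset=", 8) == 0) {
                const char *v = p + 8;
                size_t len = strcspn(v, "; \t");

                printf("charset param: %.*s\n", (int)len, v);
            }
            p = strchr(p, ';');
        }
    }

    int main(void)
    {
        /* trailing space and duplicate parameter, as seen in the wild */
        scan_charsets("text/html ; charset=iso-8859-1; charset=euc-kr");
        return 0;
    }
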
robot:

take stats on domain names e.g. foo.co.kr, www.bar.com

url char stats e.g. 8-bit, escaped 8-bit, etc

hierarchical tag and attribute stats, not flat attr space

nntp robot

ftp robot

dns robot

ip robot

randomize urls?
hash:

improve hashing (grow tables, prime numbers)

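a sketch of the grow policy only, with the rehash itself elided
(rehash_to below is a hypothetical helper, not existing code):

    #include <stdio.h>
    #include <stddef.h>

    static int is_prime(size_t n)
    {
        size_t i;

        if (n < 2)
            return 0;
        for (i = 2; i * i <= n; i++)
            if (n % i == 0)
                return 0;
        return 1;
    }

    static size_t next_prime(size_t n)
    {
        while (!is_prime(n))
            n++;
        return n;
    }

    /* call after each insert; if it returns a bigger size, allocate
       that many buckets and re-bucket every key (rehash_to(), a
       hypothetical helper, is not shown) */
    static size_t maybe_grow(size_t nbuckets, size_t nentries)
    {
        if (nentries * 4 > nbuckets * 3)    /* load factor > 0.75 */
            return next_prime(nbuckets * 2 + 1);
        return nbuckets;
    }

    int main(void)
    {
        /* 80/101 > 0.75, so the table grows to the next prime */
        printf("%lu\n", (unsigned long)maybe_grow(101, 80));
        return 0;
    }
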
use less memory in url hash table (value not needed, only key needed)
general:

find memory leaks

use less memory in url list (use array, remove processed urls)

use nm to find all system calls, and do proper error checking on all
of them, e.g. write() to catch sigpipe-like stuff(?)

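a sketch of the write() case: retry partial writes and EINTR, and with
SIGPIPE ignored a dead peer comes back as an EPIPE error instead of
killing the process (xwrite is a hypothetical name):

    #include <errno.h>
    #include <signal.h>
    #include <unistd.h>

    /* write all of buf, retrying partial writes and EINTR; returns
       len on success, -1 on a real error such as EPIPE */
    static ssize_t xwrite(int fd, const void *buf, size_t len)
    {
        const char *p = buf;
        size_t left = len;

        while (left > 0) {
            ssize_t n = write(fd, p, left);

            if (n < 0) {
                if (errno == EINTR)
                    continue;
                return -1;
            }
            p += n;
            left -= n;
        }
        return (ssize_t)len;
    }

    int main(void)
    {
        signal(SIGPIPE, SIG_IGN);   /* once, early in main() */
        return xwrite(1, "hello\n", 6) == 6 ? 0 : 1;
    }
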
remove exit() calls from underlying code