Computer hackers have taken advantage of the acceptance of these packers as suboptimal network optimization tactics and are using them as a way to evade and bypass security controls on the gateway and at the host. Consequently, exploits or other malicious code are delivered successfully, directly to a user's vulnerable system, because of the packer's ability to bypass anti-virus and IDS/IPS.
While I am personally a bit confused by the lack of examples and detailed information about the exploit procedure, the crux of the matter seems to be that a lot of packers use eval(), and we have known for a long time that eval is evil.
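To make the pattern concrete, here is a minimal sketch (with a harmless stand-in payload of my own, not an example from the article): the script travels as an encoded string and only becomes readable code in the browser when eval() runs, so a signature-based scanner on the gateway never sees the actual source.

```javascript
// Minimal sketch of the eval() pattern most packers share. Real packers use much
// denser encodings than base64, but the security property is the same: the code
// only exists in readable form at runtime, after it has passed the gateway.
var packed = btoa("alert('unpacked at runtime');"); // in practice the string arrives pre-encoded
eval(atob(packed)); // decode and execute; a scanner on the wire only ever saw the blob
```

Anything funneled through eval() like this stays opaque to inspection until it executes, and that is exactly the property attackers piggyback on.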
The article then goes over the different packers in use and lists their problems:
- The inability to easily verify and audit code
- The administrative overhead of repacking code for each change
- Suboptimal compression
- The increased risk of false negatives which may lead to a site being used to spread malicious code
- The increased risk of false positives, which may lead to a site or some of its functions being blocked by security controls
- Noticeable negative impact on client-side performance
It ends with a list of recommendations for what to do instead of relying on packers in the final product:
- Reliance on increases in average available bandwidth
- Reliance on local and network caching
- Using only safe whitespace/comment reduction techniques (see the sketch after this list)
- Automatic application of safe techniques as a last step in the publishing process
- The use of mod_deflate/mod_gzip for compressing the HTTP response data
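To show what "safe" means in the third recommendation (a sketch of my own, not from the article): the reduction only strips characters and comments the parser ignores, so what ships is still the same directly auditable program, with no eval() indirection.

```javascript
// Development source: readable and commented.
function add(a, b) {
    // Sum two numbers.
    return a + b;
}

// After safe whitespace/comment reduction, the shipped file would contain only:
// function add(a,b){return a+b;}
// Identical behavior, still plain code that a reviewer or scanner can read as-is.
```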
I’ve yet to see the last option in the wild, but I agree with gzipping and with applying compression as an automatic step in the publishing process rather than maintaining it in the final product. Personally, I don’t think that relying on an increase in average available bandwidth is a safe option, though.
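For the curious, the mod_deflate option amounts to a few lines of Apache 2.x configuration (a sketch; AddOutputFilterByType is the standard mod_deflate directive, but the exact MIME-type list here is my assumption):

```apache
# Compress textual responses on the fly. Unlike a JavaScript packer, the
# encoding is transparent: the server and any auditor still see plain source.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```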
What do you do, or, even more interestingly, have you encountered security problems caused by using packers?
Full article: SecureWorks – The Packer 2.0 threat