snax

explanation of the rails security vulnerability in 1.1.4, others

introduction

second update

Please also see my Anatomy of an Attack Against 1.1.4, which is more detailed and more accurate regarding the mechanics of the flaw.

first update

The problem is not limited to remote execution; there are denial-of-service issues too. See this thread on Ruby-Forum, and the official explanation on the Rails weblog. However, as Chris Carter points out on Peter Krantz’s blog:

“The scary part is that it can run your schema.rb file, which will drop all your tables and recreate them.”

I always use a SQL schema instead of a Ruby schema because I use a visual database modeler, so I was immune to this particular use of the bug. Your first line of defense should be to rename schema.rb to schema.something_else.
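A minimal sketch of that defense, assuming the standard db/schema.rb location; the replacement name is arbitrary, anything without an .rb extension will do:

```ruby
# Defensive one-off: move the Ruby schema to a name that Rails cannot
# be tricked into requiring. Paths and target name are illustrative.
schema = "db/schema.rb"
File.rename(schema, "db/schema.no_rb") if File.exist?(schema)
```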

A vulnerability tester (not an exploit) has been posted to Pastie. Usage instructions and some tests of different combinations of webservers and Rails versions are here, on the Caboose wiki. The testing script was written by Zed Shaw. Be very careful to check that it won’t accidentally do something dangerous to your app. Warning: don’t run it against production, because it will reload your database schema if your application is vulnerable.

details

The patch location is easily discovered with the elite hacking tool diff -r:

routing.rb 1.1.4
  base[0, extended_root.length] == extended_root || base =~ %r{rails-[\d.]+/builtin}
routing.rb 1.1.5
  def file_kinds(kind)
    ((@file_kinds ||= []) << kind).uniq! || @file_kinds
  end

  file_kinds :app

  base.match(/\A#{Regexp.escape(extended_root)}\/*#{file_kinds(:lib) * '|'}/) || base =~ %r{rails-[\d.]+/builtin}
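To see why the 1.1.4 check was insufficient, compare the plain prefix test against what the path actually resolves to (the paths here are illustrative):

```ruby
extended_root = "/www/rails"
base = "/www/rails/../../tmp"

# Old 1.1.4 validation: a simple prefix comparison, which the
# traversal string satisfies.
old_check = base[0, extended_root.length] == extended_root
puts old_check               # true

# Resolving the traversal shows the path escapes the application root.
puts File.expand_path(base)  # "/tmp"
```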

It looks like, for example, that if your Rails installation was in /www/rails/, a string such as /www/rails/../../tmp/ would pass the old validation. If you had managed to upload a file such as hax_controller.rb to /tmp/, a route request to /hax/ would then force Rails to run your arbitrary code. It seemed as though the browser-submittable header field $HTTP_LOAD_PATH would get rewritten to $LOAD_PATH during certain requests, although this may not hold in production mode, or it may depend on your webserver. Even so, file_column and other plugins upload files to places within the Rails tree (e.g. #{RAILS_ROOT}/public/files/model_name/id/filename), so backtracking on the path wouldn't have been necessary: /public/files/model_name/id/filename would have been a valid route even though there was no such controller. (I think. I am not an expert on routes.) If you could predict the landing place of an uploaded .rb file, you could compromise the system.
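The $HTTP_LOAD_PATH name follows the standard CGI convention for exposing request headers, which is why a browser-supplied header could plausibly collide with an internal variable. A sketch of the naming rule (the Load-Path header itself is hypothetical):

```ruby
# CGI convention: each request header becomes an environment variable
# named HTTP_ plus the upcased header name with dashes as underscores.
def cgi_env_name(header)
  "HTTP_" + header.upcase.tr("-", "_")
end

puts cgi_env_name("Load-Path")   # "HTTP_LOAD_PATH"
puts cgi_env_name("User-Agent")  # "HTTP_USER_AGENT"
```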

However, as mentioned in the update above, the default Rails install tree contains files which are already dangerous and can be executed without manipulating the $LOAD_PATH at all. This means there is built-in potential for denial of service, as well as database corruption if the schema is reset or migrations are forced to run in the wrong order.

criticism

The security-through-obscurity response of the Rails team has been widely criticized. As a commenter notes on the Rails blog, this is the kind of thing that gets software permanently rejected by organizations where risk management is a high priority. Kristian Köhntopp reminded me that Theo de Raadt tried the same approach with the famous OpenSSH vulnerability. It didn’t work then either. Open-source and limited disclosure policies tend not to go well together.

In other news, I posted release 12 of the has_many_polymorphs plugin.