I agree with you; indeed, we modified the robots.txt file.
The problem is mainly due to the Disallow rule under 'User-Agent: Googlebot': even if that block grants access only to '.js' and '.css' resources, Google then skips the generic 'User-Agent: *' block entirely, since a crawler follows only the most specific group that matches it.
It's better to avoid that block and simply remove "Disallow: /media/" from the generic rules instead. This is our conclusion after several tests.
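As a quick sanity check, the simplified robots.txt can be tested with Python's urllib.robotparser; the paths and rules below are hypothetical stand-ins for your actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical simplified robots.txt: no Googlebot-specific block,
# and the generic block no longer disallows /media/.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# CSS under /media/ is now crawlable by Googlebot.
print(rp.can_fetch("Googlebot", "http://example.com/media/style.css"))   # True
# The remaining Disallow still applies.
print(rp.can_fetch("Googlebot", "http://example.com/admin/secret.html")) # False
```

Note that urllib.robotparser does not implement Google's `*` path wildcards, so it cannot be used to verify rules like `Allow: /*.js*`; it is only a check on plain path prefixes.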
Everywhere, it seems that the correct and easy fix is adding the lines:
User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
http://upcity.com/blog/how-to-fix-googlebot-cannot-access-css-and-js-files-error-in-google-search-console/
But as you pointed out, this solution does not seem fully correct.
Thanks for your help.