Urs Riggenbach Webdesign, Consulting, Renewable Energies

My Time

Open Access to Renewable Energies

Since 2012 I've been working with GoSol.org toward initiating a global wave of solar entrepreneurship.

Practical Tools for Change

Building web platforms and systems with purpose that connect actors, empower people and facilitate change.


Curriculum Vitae

I am a web designer, sysadmin and renewable energy specialist. I'm a UWC and COA graduate, farmer, human ecologist, open-source enthusiast, techno peasant, biker, longboarder, scout, hiker, junglist and salsa dancer.

My mantra is Global Collaboration - Local Production.

I hold an International Baccalaureate from the United World College in India and a BA in Human Ecology from the College of the Atlantic, US.

GoSol.org
Chief Operating Officer

June 2012 - now
Launch of an innovative platform for spreading solar thermal energy solutions.

Solar Fire Concentration Ltd, Finland
Chief Operating Officer

June 2012 - now
Solar thermal technology solutions in the humanitarian and industrial sectors.
- Technological development, project management, IT consulting, web and communication.

Autodesk Inc, San Francisco
Pier 9 Impact Residency

February 2017 - May 2017
- Industrial CNC machine training (waterjet, laser cutting, 5-axis CNC)
- Rapid prototyping using state-of-the-art CNC machinery

Oekozentrum Langenbruck, Switzerland
Swiss Civil Service

August 2013 - February 2014
- Support in research and development.

WWF Switzerland
Swiss Civil Service

February 2013 - July 2013
- Development of exhibition on renewable energies.

Sustainable Design/Build, Yestermorrow, USA
Semester course in sustainable design and building practices

- Instructors from the fields of architecture, construction and joinery/carpentry
- Application of sustainability and sustainable-design principles in the architecture of a 227-square-foot "tiny house"
- Project planning and management with different build milestones
- Construction of the entire tiny house, featured in the New York Post as "Tiny House 227"
- Study and implementation of HVAC systems.

College of the Atlantic, USA
Bachelor of Arts in Human Ecology

September 2008 - June 2012
- Relevant Coursework: Agroecology, Economic Development, International Water Resource Management, Physics II, Collaborative Leadership, Fieldwork: Seminar in Community-based Research, Documentary Film Making, Webdesign, Fixing Food Systems, Sustainability, Local Production - Global Collaboration.
- Senior project in Nepal: installed a renewable energy system at a rural school
- Gained Spanish proficiency during a project stay in Yucatán, Mexico
- Davis UWC Scholar: full scholarship awarded

United World College, India
International Baccalaureate, IB

September 2006 - May 2008
- International Baccalaureate (IB) with majors in biology and economics.
- Course language English.
- Extended essay: Sugarcane Cultivation in the Mulshi Valley, India.
- Full Scholarship from the Swiss Association for UWC




Decrypting Formatted LUKS Partitions

Posted Friday 14 February 2020 by Urs Riggenbach.

Linux allows for block-level filesystem encryption via LUKS and the cryptsetup utility. When installing Linux, disk encryption is a recommended option, as it improves your data security and protection. Encrypted external drives, however, are unreadable on Mac and Windows computers, which will then ask you if you want to format them. If you've formatted a drive by accident, do not panic: just make sure you don't write any new data to the drive, and follow the steps below to get your data back.

1. Search the hard drive for the (missing) LUKS partition.
Substitute sdc with your hard drive (use, for example, gnome-disks to identify the drive path):

hexdump -C /dev/{sdc} | grep LUKS

This will output something like:

hexdump -C /dev/{sdc} | grep LUKS
2e3b5040  65 73 73 20 64 65 6e 69  65 64 00 4c 55 4b 53 ba  |ess denied.LUKS.|
2f500000  4c 55 4b 53 ba be 00 01  61 65 73 00 00 00 00 00  |LUKS....aes.....|

If you have multiple encrypted partitions on the drive, you will get more matches. If you have just one partition, you can cancel the command (Ctrl+C) once the first match appears.
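If hexdump over a whole drive is too slow, the same search can be sketched in a few lines of Python. This is an illustrative scan for the LUKS header magic (the bytes grep matches above), not part of the original recipe:

```python
# Illustrative scan for the LUKS header magic ("LUKS\xba\xbe"), the same
# bytes grep matches in the hexdump output above.
LUKS_MAGIC = b"LUKS\xba\xbe"

def find_luks_offsets(path, chunk_size=4 * 1024 * 1024):
    """Return absolute byte offsets of every LUKS magic found in the file."""
    offsets = []
    with open(path, "rb") as f:
        base = 0   # absolute offset of buf[0]
        buf = b""
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            buf += chunk
            i = buf.find(LUKS_MAGIC)
            while i != -1:
                offsets.append(base + i)
                i = buf.find(LUKS_MAGIC, i + 1)
            # Keep a short tail so a magic split across chunks is still found.
            keep = len(LUKS_MAGIC) - 1
            base += len(buf) - keep
            buf = buf[-keep:]
    return offsets

# Usage (needs read access to the device):
# for off in find_luks_offsets("/dev/sdc"):
#     print(hex(off))
```

The offsets print in the same hexadecimal form that hexdump reports, so they can be fed straight into the losetup step below.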

2. Loop-mount the found partition.
Prefix the address output by grep in the previous step (for example: 2f500000) with "0x":

losetup -o 0x2f500000 -r -f /dev/{sdc}
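Note that losetup's -o option expects a byte offset; the leading "0x" simply tells it to parse the hexdump address as hexadecimal. A quick sanity check with my example address (yours will differ):

```python
# hexdump prints addresses in hexadecimal; "0x" lets losetup parse them as such.
offset = int("2f500000", 16)   # example address from the hexdump output above
print(offset)                  # byte offset into the drive: 793772032
print(offset / 2**20)          # that is 757.0 MiB into the drive
```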

3. Decrypt the found partition.
With the following command, the decrypted contents will be made available at /dev/mapper/decrypted_partition. You will be asked for your passphrase.

cryptsetup luksOpen /dev/loop0 decrypted_partition

4. Access the decrypted partition
For regular filesystems, such as ext4, btrfs, etc., you should now see the partition in your favorite file browser or in gnome-disks.

If the partition contains an LVM, run:

vgchange -ay


And then check your file browser or gnome-disks for your hard drive. De-panic and back up your data to another disk.

Optimizing Images for Google PageSpeed with PHP and NGINX

Posted Monday 16 July 2018 by Urs Riggenbach.

One of the most critical factors in getting a high Google PageSpeed ranking is optimized images: PageSpeed penalizes any image that is not compressed. This blog post will show you a framework-independent way to optimize your images with PHP and NGINX.

SPIP, Drupal, WordPress and the like usually store and serve uncompressed files from a single folder. In SPIP it is "local"; in WordPress it would be "wp-uploads". Instead of directly optimizing the source images in these folders, I've developed a PHP script that copies the files to a separate folder mimicking the same folder structure and filenames, and then optimizes the copies. The specific tools used for optimization are jpegoptim and optipng.


Optimizing the Images
Deploy the following script in the root directory of your PHP framework.

<?php
// Optimizes images for delivery over the web.

// Copy a source image to the mirrored output folder, creating it if needed.
// Returns "already_processed" when the output file exists, "not_processed" otherwise.
function copyfile($in, $out, $outfolder) {
        // Check if the file is already in the destination.
        if (file_exists($out)) {
                $return = "already_processed";
        } else {
                exec('mkdir -p ' . escapeshellarg($outfolder));
                exec('cp ' . escapeshellarg($in) . ' ' . escapeshellarg($out));
                exec('chmod 777 ' . escapeshellarg($out));
                echo "copied file: $in to $out\n";
                $return = "not_processed";
        }
        // Passing ?all forces re-optimization of every file.
        if (isset($_GET['all'])) {
                $return = "not_processed";
        }
        return $return;
}

// Collect all files below the source folder.
$rii = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('/var/www/html/local'));
$files = array();
foreach ($rii as $file) {
        if ($file->isDir()) {
                continue;
        }
        $files[] = $file->getPathname();
}

foreach ($files as $value) {
        $file_input = $value;
        // Regex to swap the source folder for the optimized folder.
        $re = '/(\/var\/www\/html\/local)/';
        $subst = '/var/www/html/local_optimized';
        $file_output = preg_replace($re, $subst, $value);
        // Folder name of the output file.
        $file_output_pathinfo = pathinfo($file_output);
        $file_output_folder = $file_output_pathinfo['dirname'];
        echo $file_input . "\n";
        echo $file_output . "\n";
        echo $file_output_folder . "\n";
        if (exif_imagetype($value) == IMAGETYPE_PNG) {
                echo "The picture is a PNG...\n";
                $processing_status = copyfile($file_input, $file_output, $file_output_folder);
                if ($processing_status == "already_processed") {
                        echo "already processed... nothing to do.\n\n";
                } else {
                        echo "processing now...\n\n";
                        $output = exec('optipng -o5 ' . escapeshellarg($file_output));
                        echo $output;
                }
        }
        if (exif_imagetype($value) == IMAGETYPE_JPEG) {
                echo "The picture is a JPG...\n";
                $processing_status = copyfile($file_input, $file_output, $file_output_folder);
                if ($processing_status == "already_processed") {
                        echo "already processed...\n\n";
                } else {
                        echo "processing now...\n\n";
                        $output = exec('jpegoptim --verbose --max=80 --strip-all --preserve --totals ' . escapeshellarg($file_output));
                        echo $output;
                }
        }
}


You can run the script directly from the command line, such as php filename.php, or access it over the internet via https://yourwebsite/filename.php. Running it on the CLI and automating it with a cron job is better, however: there is no PHP max-execution-time limit on the CLI, and optimizing images is a resource-heavy process.
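For example, a nightly cron job could look like the following (the script name, log path, and user are placeholders, adapt them to your setup):

```
# /etc/cron.d/optimize-images -- run the optimizer every night at 03:00
0 3 * * * www-data /usr/bin/php /var/www/html/filename.php >> /var/log/optimize-images.log 2>&1
```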


Configuring NGINX
Next we need to configure NGINX to serve images from the "local_optimized" folder as opposed to the "local" folder. Because the above script runs periodically as a cron job, we want to fall back to the "local" folder when the image cannot be found in the "local_optimized" folder (yet).

In your NGINX website conf file, make sure you add a "location" block for your main website ("/"), then serve the "local_optimized" folder first and the original "local" folder as backup:

#your main location block
location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://your_upstream;  # replace with your backend
}

#your optimized image block
location /local/ {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        # replace with your backend, rewriting /local/ to /local_optimized/
        proxy_pass http://your_upstream/local_optimized/;
        proxy_intercept_errors on;
        recursive_error_pages on;
        error_page 404 = @static_image_https;
}

#your optimized image block fallback, serving the original "local" folder
location @static_image_https {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://your_upstream;  # replace with your backend
}

Don’t forget to run nginx -t and service nginx reload.


Validate your Setup
As mentioned in the beginning, Google PageSpeed is a great tool to validate that your images are served compressed. To verify that the script is working, you can simply compare the file sizes of the images in the "local" folder with those in the "local_optimized" folder.
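To compare in bulk, you could total the bytes under each folder, for example with this short Python sketch (the folder paths are the ones used by the script above):

```python
import os

def folder_size(path):
    """Total size in bytes of all files under path, recursively."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

# Usage:
# orig = folder_size("/var/www/html/local")
# opt = folder_size("/var/www/html/local_optimized")
# print(orig, "->", opt, "bytes")
```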


Contact Me

Send me an email to mail@ursrig.com