This site is about my life and work. Most of what I do is devoted to practical solutions for avoiding a global climate crisis. Acting from a place of compassion for all living things (or at least trying to!), I'm always ready to be challenged on how to maximize impact. Passionate about developing "whatever is missing", I'm building towards a sustainable future for all.
Since 2012 I've been working with Solar Fire toward initiating a global wave of solar entrepreneurship.
Building purposeful web platforms and systems that connect actors, empower people and facilitate change.
Contact Urs Riggenbach:
info@ursrig.com, 079 918 0663
I get involved in things that make sense to me, have impact or bring learning and creation with it.
June 2012 - now
Solar thermal technology solutions in the humanitarian and industrial sector.
– Technological development, project management, IT consulting, web and communication.
June 2012 - now
Launch of an innovative platform for the spread of solar thermal energy solutions.
February 2017 - May 2017
– Industrial CNC machine training (waterjet, laser cutting, 5-axis CNC)
– Rapid prototyping using state of the art CNC machinery
2014 - 2022
Development of custom web-platforms for the Zurich-based environmental communications agency.
2014 - now
Creation of a web-development agency for impact and sustainability projects and beyond.
February 2013 - July 2013
– Development of exhibition on renewable energies.
August 2013 - February 2014
– Support in research and development.
– Instructors from the fields of architecture, construction and joinery/carpentry
– Application of principles of sustainability and sustainable design in the architecture of a "tiny house" of 227 square feet.
– Project planning and management with different build milestones.
– Construction of the entire tiny house; see it in the New York Post as "Tiny House 227".
– Study and implementation of HVAC systems.
September 2008 - June 2012
– Relevant Coursework: Agroecology, Economic Development, International Water Resource Management, Physics II, Collaborative Leadership, Fieldwork: Seminar in Community-based Research, Documentary Film Making, Webdesign, Fixing Food Systems, Sustainability, Local Production - Global Collaboration.
– Senior project in Nepal installing renewable energy framework at rural school
– Gained Spanish proficiency during a project stay in Yucatán, Mexico
– Davis UWC Scholar: full scholarship awarded
September 2006 - May 2008
– International Baccalaureate (IB) with majors in biology and economics.
– Language of instruction: English.
– Extended essay: Sugarcane Cultivation in the Mulshi Valley, India.
– Full Scholarship from the Swiss Association for UWC
1990
I work with a network of curated developers, designers and content creators, so each project happens in a project-specific team. For the Canton of Bern I built the interactive app "Biz-Links", which helps people find the right career. As a progressive web app it is simultaneously available on Android and iOS devices, as well as directly reachable in the browser by link. My full portfolio is further below.
A selection of projects in the fields of web development, internet security, Android app development, webstores, crowdfunding and campaign sites, online community platforms and financial modeling. Most projects run on a tech stack optimized for security and scalability, built on open-source SPIP, LXC and GNU/Linux.
Have a project idea and need a tech team? Let's talk!
Posted Sunday 15 November 2020 by Urs Riggenbach.
This post is part of a series of creative writing done in the winter days of 2020.
"Much suffering is in vain and could be avoided with a dose of self-love, nonviolent communication and the deflection of manipulation." – an introductory statement to the series of creative writing posts.
Posted Friday 14 February 2020 by Urs Riggenbach.
Linux allows for block-level filesystem encryption via LUKS and the cryptsetup utility. When installing Linux, disk encryption is a recommended option, as it improves your data security and protection. Encrypted external drives are unreadable on Mac and Windows computers, which will ask you whether you want to format them. If you've formatted a drive by accident, do not panic: just make sure you don't write any new data to the drive, and use the steps below to get your data back.
1. Search hard-drive for LUKS (missing) partition.
(Substitute sdc with your hard drive; use, for example, gnome-disks to identify the drive path.)
hexdump -C /dev/{sdc} | grep LUKS
This will output something like:
2e3b5040  65 73 73 20 64 65 6e 69 65 64 00 4c 55 4b 53 ba  |ess denied.LUKS.|
2f500000  4c 55 4b 53 ba be 00 01 61 65 73 00 00 00 00 00  |LUKS....aes.....|
→ If you have multiple encrypted partitions on the drive, you will get more matches. If you have just one partition, you can cancel the command once the first match appears.
2. Loopmount the found partition.
Add "0x" to the location descriptor (for example: 2f500000) outputted by GREP in previous step.
losetup -o 0x{{2f500000}} -r -f /dev/{sdc}
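If you'd rather not copy the offset by hand, the first hexdump field can be extracted in shell. A minimal sketch (the sample line below is illustrative, taken from the example output above; the losetup call is shown as a comment because it needs your real device):

```shell
# Parse the byte offset out of a hexdump match line (sample line is illustrative).
line='2f500000  4c 55 4b 53 ba be 00 01 61 65 73 00 00 00 00 00  |LUKS....aes.....|'
offset="0x$(printf '%s\n' "$line" | awk '{print $1}')"
echo "$offset"
# Then mount read-only at that offset, e.g.:
#   losetup -o "$offset" -r -f /dev/sdc
```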
3. Decrypt the found partition.
With the following command, the partition will be mapped at /dev/mapper/decrypted_partition. You will be asked for your passphrase.
cryptsetup luksOpen /dev/loop0 decrypted_partition
4. Access the decrypted partition
For regular partitions, such as ext4, btrfs, etc., you should now see the partition in your favorite file browser or in gnome-disks.
If the partition contains an LVM, run:
vgchange -ay
Then check your file browser or gnome-disks for your drive. De-panic and back up your data to another disk.
Posted Monday 16 July 2018 by Urs Riggenbach.
One of the most critical factors in getting a high Google PageSpeed ranking is optimized images. You are penalized for any image that is not compressed. This blog post will show you a framework-independent way to optimize your images with PHP and NGINX.
SPIP, Drupal, WordPress and the like usually store and serve uncompressed files from a single folder. In SPIP it is "local"; in WordPress it is "wp-uploads". Instead of directly optimizing the source images in these folders, I've developed a PHP script that copies the files to a separate folder mimicking the same folder structure and filenames, and then optimizes the copies. The specific tools used for optimization are jpegoptim and optipng.
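The copy-then-optimize idea can also be sketched directly in shell. A minimal sketch, assuming GNU cp; the optimizer calls in the usage comment require jpegoptim and optipng to be installed, and all paths are illustrative:

```shell
# Mirror the source tree into the optimized tree; cp -n skips files
# that were already copied (and optimized) on a previous run.
mirror_tree() {
  src=$1; dst=$2
  mkdir -p "$dst"
  cp -rn "$src/." "$dst/"
}
# Usage (paths illustrative):
#   mirror_tree /var/www/html/local /var/www/html/local_optimized
#   find /var/www/html/local_optimized -iname '*.png' -exec optipng -o5 {} \;
```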
Optimizing the Images
Deploy the following script in the root directory of your PHP framework.
<?php
// Optimizes images for delivery over the web.

// Copy a file to the destination tree unless it was already processed.
function copyfile($in, $out, $outfolder) {
    // Check if the file already exists at the destination.
    if (file_exists($out)) {
        $return = "already_processed";
    }
    else {
        exec('mkdir -p ' . escapeshellarg($outfolder));
        exec('cp ' . escapeshellarg($in) . ' ' . escapeshellarg($out));
        exec('chmod 777 ' . escapeshellarg($out));
        echo "copied file: $in to $out";
        $return = "not_processed";
    }
    // Passing ?all=1 forces reprocessing of every file.
    if (isset($_GET['all'])) {
        $return = "not_processed";
    }
    return $return;
}

// Recursively collect all files under the source folder.
$rii = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('/var/www/html/local'));
$files = array();
foreach ($rii as $file) {
    if ($file->isDir()) {
        continue;
    }
    $files[] = $file->getPathname();
}

foreach ($files as $value) {
    $file_input = $value;
    // Regex to swap the source folder for the optimized folder.
    $re = '/(\/var\/www\/html\/local)/';
    $subst = '/var/www/html/local_optimized';
    $file_output = preg_replace($re, $subst, $value);
    // Folder name of the output file.
    $file_output_pathinfo = pathinfo($file_output);
    $file_output_folder = $file_output_pathinfo['dirname'];
    echo $file_input . "\n";
    echo $file_output . "\n";
    echo $file_output_folder . "\n";
    if (exif_imagetype($value) == IMAGETYPE_PNG) {
        echo "The picture is a PNG...\n";
        $processing_status = copyfile($file_input, $file_output, $file_output_folder);
        if ($processing_status == "already_processed") {
            echo "already processed... nothing to do. \n \n";
        } else {
            echo "processing now... \n \n";
            $output = exec('optipng -o5 ' . escapeshellarg($file_output));
            echo $output;
        }
    }
    if (exif_imagetype($value) == IMAGETYPE_JPEG) {
        echo "The picture is a JPG...\n";
        $processing_status = copyfile($file_input, $file_output, $file_output_folder);
        if ($processing_status == "already_processed") {
            echo "already processed...\n\n";
        }
        else {
            echo "processing now...\n\n";
            $output = exec('jpegoptim --verbose --max=80 --strip-all --preserve --totals ' . escapeshellarg($file_output));
            echo $output;
        }
    }
}
?>
You can run the script directly from the command line (php filename.php) or access it over the internet at https://yourwebsite/filename.php. However, running it over the CLI and automating it with a cron job is better: there is no PHP max execution time limit on the CLI, and optimizing images is a resource-heavy process.
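As a sketch of such a cron job (the script path, log path and schedule are illustrative, not part of the original setup), the crontab entry could look like:

```
# Run the optimizer nightly at 03:15 as the web server user (add via crontab -e).
15 3 * * * /usr/bin/php /var/www/html/filename.php >> /var/log/image_optimize.log 2>&1
```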
Configuring NGINX
Next we need to configure NGINX to serve images from the "local_optimized" folder instead of the "local" folder. Because the above script runs periodically as a cron job, we want to fall back to the "local" folder whenever an image cannot be found in the "local_optimized" folder (yet).
In your NGINX site conf file, make sure you add a "location" block for your main website ("/"), then serve the "local_optimized" folder first and the original "local" folder as a fallback:
#your main location block
location / {
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass https://10.0.0.10:443;
}
#your optimized image block
location /local/ {
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass https://10.0.0.10:443/local_optimized/;
proxy_intercept_errors on;
recursive_error_pages on;
error_page 404 = @static_image_https;
}
#your optimized image block fallback
location @static_image_https {
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass https://10.0.0.10:443;
}
Don’t forget to run nginx -t and then service nginx reload.
Validate your Setup
As mentioned in the beginning, Google PageSpeed is a great tool to validate that your images are served compressed. To verify the script is working, you can simply compare the file sizes of the images in the "local" folder with those in the "local_optimized" folder.
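For a quick file-by-file check, a small helper does the trick. A minimal sketch, assuming GNU stat; the paths in the usage comment are illustrative:

```shell
# Print a file's size in bytes (GNU stat).
size_of() { stat -c '%s' "$1"; }
# Usage (paths illustrative) -- the optimized copy should report fewer bytes:
#   size_of /var/www/html/local/photo.jpg
#   size_of /var/www/html/local_optimized/photo.jpg
```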