I’ve been using the Warden CLI tool to orchestrate Magento (Adobe Commerce) environments for local development. Recently I needed to upgrade my blog and one other site, both based on WordPress, and I realized that my old local development stack based on Homebrew had stopped working some time ago. It was time to move to a more solid solution.
Starting and stopping brew services using BitBar plugin
I was looking for a UI solution to easily start and stop Homebrew services. Check out what I found – a handy app with ready-to-use plugins.
Crazy Magento 2 core patches
Last year I took over maintenance of a Magento 2 Commerce site. One of the first tasks was adjusting the site to meet industry standards, as it was a bit messy. The whole content of the vendor/ directory was kept in the repository and there were about 30 patch files placed in the site root, without any information about which of them had been applied or in which order. Most of them looked like they had been provided by M2 support. I needed to clean this up – figure out which patches were applied and in which order, and finally find a way that would let me remove vendor/ from the repository and apply the patches dynamically during deployment.
This was a challenging task – I kept comparing the vendor/ contents with the same directory from a clean Magento 2 installation, and then kept applying patches until the directories were identical. I managed to do this, and it proved that about 25 of these patches were actually applied.
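For illustration only, here is a minimal sketch of that kind of check, assuming standard unified-diff patch files and a clean copy of the same Magento release unpacked next to the project (the directory and patch file names below are hypothetical):
# compare the project's vendor/ with the one from a clean installation
diff -qr clean-magento2/vendor/ project/vendor/
# a dry run that applies cleanly means the patch is NOT applied yet
patch -p1 -d project/ --dry-run < PATCH_example.patch
# a reverse dry run that applies cleanly means the patch IS already applied
patch -p1 -d project/ --dry-run -R < PATCH_example.patch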
Recently I wrote a blog post explaining how to deal with the second part of this challenge – check the Rocket Web blog to see how to apply Magento 2 core patches.
Magento 2: add CMS Page programmatically
I’m starting a series of short blog posts to share solutions to common Magento 2 problems. In the first one I’m going to show how to add a CMS page programmatically in setup scripts.
How I started my career thanks to Ultima Online
About 13 years ago I was in college and used to play Ultima Online. It’s a fantasy role-playing game – actually the first game of the massively multiplayer online role-playing game (MMORPG) genre. I didn’t play much on the official servers; I played mostly on server emulators, created by a community that reverse engineered the game and ran its own servers.
As I played more, I got curious how it all worked, so I dug a bit into the emulators. I tried setting up a server using different emulation software and tried building sceneries and buildings. It was challenging to find online resources that could help with the emulators – it was a few years before Stack Exchange launched, so I could only get help on bulletin boards or IRC channels.
At some point I had gathered a bunch of Ultima Online materials and spotted a web development tutorial in a computer magazine. As I had already tried programming on the Amiga a couple of years earlier, web development looked encouraging. I gave it a try and got completely absorbed. Soon I created my first website and started sharing my experience and materials with other people who played and modded Ultima Online. I used PHP to run the website and stored the site data in text files. I may write more about this some day.
The site ran for the next few years and became popular in the Polish Ultima Online community. In the meantime I started working on other websites. One of these sites became my first paid web development job, and after some time web development became my main source of income.
Currently the website is offline, but Ultima Online still has active servers that can be played on. In about two weeks, on September 24th, it will turn 18 years old!
It looks like it’s worth playing games and, in general, having a hobby. Gaming led me to a career in web development, which itself became one of my hobbies.
Extension developers, test your extensions
This post isn’t about advanced testing techniques like Test Driven Development, unit testing or functional testing. It’s just about spending some time to USE an extension you wrote to make sure it works in different cases.
Meet Magento 2014 Poland remarks
It’s been almost two months since I got back from the Meet Magento Poland 2014 conference. These last two months were really busy for me and a lot of things are still going on, so I haven’t had a chance to write about the event yet. I decided not to write a full story; instead, I’m going to wrap up a random list of notes and thoughts I gathered after the conference.
My remarks
- an interesting case study of a multi-store and multi-language Magento integration was presented by LPP and Accenture
- it was hard to switch between the Business and Technology tracks and I missed more than half of Kuba Zwoliński’s presentation about iBeacon. I hope to see videos from the conference sometime soon!
- there was a nice introduction to Magento 2 caching by Marko Martinović. I was surprised that the Magento 2 Community Edition includes Full Page Cache support
- good points on contributing to open source Magento extensions and projects in general by Tsvetan Stoychev from Jarlssen. I still need to review the extensions published on Jarlssen’s GitHub
- I couldn’t watch Damian Luszczymuk’s presentation about Docker as I chose the Flexible Billing talk by our CEO, Matt MacDougall. However, I was really lucky and Damian walked me through the presentation briefly later that day. I promise to dig into Docker soon.
- there was an interesting case study on Magento-SAP integration made for Mennica Polska by Robert Żochowski from Bold Agency. I’m excited to buy some gold from a store based on Magento.
- I got some interesting ideas about introducing developers to Magento development and noted a bunch of training materials from Ben Marks’ presentation
- Daniel Sloof gave a good overview of HHVM and I’m curious to see Magento 2 running on that engine
- our booth with space figures and NASA suits was very successful, bringing a lot of people to our place!
- I missed the presentation about Magento indexers by Maciej Ostrowski and I hope to see the video soon
- I saw a cool demo of cobby.io, a tool which allows managing product data in Excel. It sounds like a crazy idea but looks really interesting and works nicely
- community dinner at Podwale 25 restaurant was delicious. I hope to visit that cool place next time I’m in Warsaw
- enjoyed Thomas Goletz’s story about the Gobi desert race and the Chinese Magento branch
- the conference iPhone application was really cool; however, the iBeacon features didn’t work on my iPhone. It’s time to move to a new one
These are just random thoughts I got after the conference. I really liked the event and I hope to go to Meet Magento 2015 later this year :-)
Photo by Viacheslav Kravchuk, Atwix. Thanks!
How to prepare Magento 2 beta package for offline use
Magento 2 comes with a composer installer, and all external dependencies, including sample data, are installed using composer. However, I needed a simple way to install Magento 2 along with sample data in an offline environment, without using composer. I had a few reasons to do this – I wanted a fast way to install Magento 2 multiple times, and I wanted to test command-line installation for MageTesting.com purposes.
My main goals were:
- avoid downloading more than 1 GB of data each time
- let it work in offline mode
- operate with smaller packages
- simplify steps needed to install Magento 2
Cloning the Git repository and downloading dependencies resulted in more than 1 GB of data:
- Magento 2 code cloned with packages downloaded using composer: 471.3 MB (194 MB after gzipping)
- sample data media: 590.9 MB (zipped)
- sample data code: 0.2 MB (zipped)
I decided to prepare a Magento 2 package containing only the code needed to run the application, and a sample data package which could be installed just by copying it into Magento 2. Recently I was playing with a sample data compression script provided by Vinai Kopp, and I made a fork which can compress Magento 2 sample data.
In the end I had the following packages:
- Magento 2 code: 26 MB (gzipped)
- compressed sample data media: 92 MB (zipped)
- sample data code: 0.2 MB (zipped)
I know there is a composer cache, and I know I could use Vagrant/Docker or other virtualization, but I still wanted to avoid overcomplicating the process. If you find this use case useful, all the needed steps are described below.
Just keep in mind that this was written for the 0.42.0-beta1 release of Magento 2 and it is not a recommended way to install Magento 2.
Prepare Magento 2 package
1. Clone the Git repository
git clone git@github.com:magento/magento2.git
2. Install composer dependencies
composer.phar update
3. Remove huge directories not needed to run the application
rm -rf dev/tests
rm -rf .git
rm -rf vendor/magento/zendframework1/documentation
rm -rf vendor/magento/zendframework1/tests
rm -rf vendor/magento/zendframework1/demos
4. Prepare package
tar czf magento2-0.42.0-beta1.tar.gz -C magento2/ .
Prepare Magento 2 sample data package
1. Download demo data
curl -O http://packages.magento.com/_packages/magento_sample-data-0.42.0-beta1.zip
curl -O http://packages.magento.com/_packages/magento_sample-data-media-0.42.0-beta1.zip
2. Compress demo data
compress-sample-data-magento2.sh magento_sample-data-media-0.42.0-beta1.zip
Install Magento 2 using created package
1. Prepare directory and unpack package:
mkdir magento2
tar xzf magento2-0.42.0-beta1.tar.gz -C magento2/
2. Set required permissions
cd magento2
chmod -R 777 var/
chmod -R 777 pub/media/
chmod -R 777 pub/static
chmod -R 777 app/etc/
3. Run Setup
php -f setup/index.php install --base_url=http://local.magento2new.com/ --backend_frontname=admin --db_host=127.0.0.1 --db_name=mage2 --db_user=mage2 --db_pass=mage2 --admin_firstname=John --admin_lastname=Doe --admin_email=john@example.com --admin_username=admin --admin_password=admin --language=en_US --currency=USD --timezone=Europe/Warsaw
Install sample data using package
1. Unpack media sample data
unzip -q -d pub/media/ compressed-magento_sample-data-media-0.42.0-beta1.zip
2. Unpack sample data code
mkdir dev/tools/Magento/Tools/SampleData
unzip -q -d dev/tools/Magento/Tools/SampleData magento_sample-data-0.42.0-beta1.zip
3. Install sample data
php -f dev/tools/Magento/Tools/SampleData/install.php -- --admin_username=admin
4. Make sure newly added files are writable:
chmod -R 777 pub/media/
chmod -R 777 pub/static
This one is a little dirty, but as far as I know composer doesn’t support installing local packages.
Let me know if you find this article useful and if you have any thoughts about it.
How to easily dump Magento database with n98-magerun
It looks like I’ve fallen in love with the n98-magerun tool. I’ve already talked about my favourite n98-magerun commands and about a command which generates fake customer data. Today I’m going to continue the n98-magerun post series and focus on a command which makes database dumps very easy. In addition to the commands mentioned in the previous blog posts, n98-magerun.phar db:dump is another must-have.
The command makes dumping a database very easy. Similar to the n98-magerun mysql-client command, it doesn’t require me to enter a password or look up any connection details. It automatically generates a dump file name based on the current date and time, allows using built-in filters to exclude big utility tables and, finally, creates an archive after completing the dump.
Sample calls may look as follows:
n98-magerun.phar db:dump
n98-magerun.phar db:dump --strip "@stripped @ee_changelog @idx"
n98-magerun.phar db:dump --strip "@stripped @ee_changelog @idx" --compression=gz
The first call produced a 5.1 GB file containing all database tables. The second stripped a bunch of tables such as changelog, index, reports and logs tables, which reduced the database dump to 3.1 GB. The third one, called with the compression option, reduced the size to about 251 MB. The third gain is expected and could also be achieved with one additional gzip command call, but it’s really convenient to get a gzipped dump in a single command.
Stripping unneeded database tables can save gigabytes of transfer when working with a big database. However, it won’t fit all use cases – for example, when you need to prepare a dump to debug index problems and you need the exact state of the Magento application database.
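As a small usage follow-up, a dump created this way can be loaded back on another environment with n98-magerun’s db:import command. This is just a sketch – the file name is a hypothetical example of what db:dump generates, and you should check the options available in your n98-magerun version with its help command:
# unpack the gzipped dump and load it into the configured Magento database
gunzip mage2_2015-01-01_120000.sql.gz
n98-magerun.phar db:import mage2_2015-01-01_120000.sql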