Simplify local development with Drude

Posted by FFW Agency on May 4, 2016 at 3:16pm
By David Hernandez, May 4, 2016

As one of the largest Drupal agencies in the world, FFW is no stranger to problems of scale. With large numbers of technical staff, clients, and concurrent projects, workflow management is vitally important to our work. And to deliver projects on time while managing resources with agility, consistency and simplicity in the tools we choose play a huge part.

When there are no standards for the tools a team uses (OS, editor, server, php version, etc.) dealing with the toolset adds unnecessary overhead that can eat away development time. You'll quickly find that setting up projects, on-boarding developers, troubleshooting, and even training all become more difficult as you deal with larger projects, larger teams, and more complex requirements.

To help solve these problems FFW created Drude.

What is Drude?

Drude (Drupal Development Environment) is a management tool for defining and managing development environments. It brings together common development tools, minimizes configuration, and ensures environment consistency everywhere in your continuous integration workflow. It automatically configures each project's environment to ensure team members are using the same tools and versions, regardless of the individual requirements for each project. Most importantly, it makes the entire process easy.

With Drude you get fully containerized environments with Docker; cross-platform support (macOS, Windows, and Linux); built-in tools like Drush, Drupal Console, Composer, and PHP Code Sniffer; plug-and-play services like Apache Solr, Varnish, and Memcache; and even built-in testing support using Behat and Selenium. Drude will even automatically configure virtual hosts for you, so no more editing hosts files and server configurations.

With all of this you also get a management tool, which is the heart of Drude. dsh is a command line tool for controlling all aspects of your project's environment. You can use it to stop and start containers, interact with the host virtual machine, use drush and console to execute commands directly against your Drupal sites, and manage the installation and updating of projects.

Let's see how this works

Download the Drude shell command

sudo curl -L https://raw.githubusercontent.com/blinkreaction/drude/master/bin/dsh -o /usr/local/bin/dsh
sudo chmod +x /usr/local/bin/dsh

You can now use the dsh command. Use it to install the prerequisites, which include Docker, Vagrant, and VirtualBox.

dsh install prerequisites
dsh install boot2docker

These are all one-time steps for setting up Drude. Once that's done, you only need to set up individual projects. To demonstrate how this works, we have Drupal 7 and 8 test projects available. Check their GitHub pages for additional setup instructions, in case the instructions below don't work for you.

https://github.com/blinkreaction/drude-d7-testing
https://github.com/blinkreaction/drude-d8-testing

Setting up a project

Clone the Drupal 8 test project.

git clone https://github.com/blinkreaction/drude-d8-testing.git
cd drude-d8-testing

Use the init command to initialize local settings and install the Drupal site via Drush.

dsh init

Starting containers...
Creating druded8testing_db_1
Creating druded8testing_cli_1
Creating druded8testing_web_1
...
Installing site...
Installation complete.
User name: admin User password: 5r58daY2vZ [ok]
Congratulations, you installed Drupal!
[status]
real 1m18.139s
user 0m0.300s
sys 0m0.174s
...
Open http://drupal8.drude in your browser to verify the setup.

The init script automates provisioning, which can be modified per project. It can initialize settings for provisioned services, import databases, install sites, compile Sass, revert features, enable or disable modules, run Behat tests, and many other things.
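As a rough illustration of what such a script looks like, the skeleton below is hypothetical (it is not Drude's actual init script): it only demonstrates the fail-fast ordering of provisioning steps, with the real commands (drush, sass, behat, and so on) indicated purely as comments.

```shell
# Hypothetical skeleton of an init-style provisioning script; the real
# step commands (drush, sass, behat, ...) are shown only in comments.
set -e  # fail fast: abort provisioning on the first error

step() { echo "==> $1"; }  # announce each provisioning step

step "Initializing settings"   # e.g. copy local settings files into place
step "Importing database"      # e.g. drush sql-cli < backup.sql
step "Installing site"         # e.g. drush site-install --yes
step "Compiling Sass"          # e.g. compile scss/ into css/
step "Reverting features"      # e.g. drush features-revert-all --yes
step "Running Behat tests"     # e.g. behat --format=progress
```

Because the script aborts on the first failed step, a broken database import or failed site install surfaces immediately instead of leaving a half-provisioned environment.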

Now, simply point your browser to http://drupal8.drude

Drupal 8 first install home page

That’s it! Any time a team member wants to participate in a project all they have to do is download the project repo and run the init command. And with the environments containerized, they can be deployed anywhere.

Why publicize all this?

Clearly, we've put in a lot of work building a great tool. One that we could easily keep to ourselves. Well, at FFW we are huge supporters of open-source. As one of the main supporters of the Drupal Console project, and a major supporter of Drupal, we believe that benefiting the community as a whole benefits us exponentially in return. We encourage anyone to use this tool, provide feedback, and even contribute to the project.


Both Sides Now: What I Learned When I Jumped from the Supplier Side to the Client Side on the Same Project

Posted by Acquia Developer Center Blog on May 4, 2016 at 2:51pm

Shortly after graduating, I found myself an internship with miggle, a UK-based Web development specialist that uses Drupal exclusively.

As a web production assistant, my primary role required me to look after the latter stages of a new website rebuild for an organization in the education sector.

Tags: acquia drupal planet

Amazee.io goes Drop Guard: for developers, for security

Posted by Drop Guard on May 4, 2016 at 12:30pm
By Johanna Anthes, May 4, 2016. Amazee.io has a full integration into Drop Guard.

2nd May 2016: amazee.io just launched their Drupal hosting platform built for developers, which has a full integration into Drop Guard. And that's when our common story started.

The Amazee team dedicates itself to the Drupal world: “We're a secure, high-performance, cloud-based hosting solution built for folks who love their Drupal sites as much as we do.”

Drupal Planet Drupal Success Story Amazee Security Hosting

Restricting Access to Content in Drupal 8

Posted by OSTraining on May 4, 2016 at 10:03am

One of our OSTraining members wanted to restrict access to certain content on his Drupal 8 site.

To do this in Drupal 8, we are going to use the Content Access module.

To follow along with this tutorial, download and install Content Access. I found no errors while using this module, but please note that currently it is a dev release.

An overview of web service solutions in Drupal 8

Posted by Dries Buytaert on May 4, 2016 at 9:48am

Today's third-party applications increasingly depend on web services to retrieve and manipulate data, and Drupal offers a range of web services options for API-first content delivery. For example, a robust first-class web services layer is now available out-of-the-box with Drupal 8. But there are also new approaches to expose Drupal data, including Services and newer entrants like RELAXed Web Services and GraphQL.

The goal of this blog post is to enable Drupal developers in need of web services to make an educated decision about the right web services solution for their project. This blog post also sets the stage for a future blog post, where I plan to share my thoughts about how I believe we should move Drupal core's web services API forward. Getting aligned on our strengths and weaknesses is an essential first step before we can brainstorm about the future.

The Drupal community now has a range of web services modules available in core and as contributed modules sharing overlapping missions but leveraging disparate mechanisms and architectural styles to achieve them. Here is a comparison table of the most notable web services modules in Drupal 8:

| Feature | Core REST | RELAXed | Services |
| --- | --- | --- | --- |
| Content entity CRUD | Yes | Yes | Yes |
| Configuration entity CRUD | Create resource plugin (issue) | Create resource plugin | Yes |
| Custom resources | Create resource plugin | Create resource plugin | Create Services plugin |
| Custom routes | Create resource plugin or Views REST export (GET) | Create resource plugin | Configurable route prefixes |
| Renderable objects | Not applicable | Not applicable | Yes (no contextual blocks or views) |
| Translations | Not yet (issue) | Yes | Create Services plugin |
| Revisions | Create resource plugin | Yes | Create Services plugin |
| File attachments | Create resource plugin | Yes | Create Services plugin |
| Shareable UUIDs (GET) | Yes | Yes | Yes |
| Authenticated user resources (log in/out, password reset) | Not yet (issue) | No | User login and logout |

Core RESTful Web Services

Thanks to the Web Services and Context Core Initiative (WSCCI), Drupal 8 is now an out-of-the-box REST server with operations to create, read, update, and delete (CRUD) content entities such as nodes, users, taxonomy terms, and comments. The four primary REST modules in core are:

  • Serialization is able to perform serialization by providing normalizers and encoders. First, it normalizes Drupal data (entities and their fields) into arrays with a particular structure. Any normalization can then be sent to an encoder, which transforms those arrays into data formats such as JSON or XML.
  • RESTful Web Services allows HTTP methods to be performed on existing resources, including but not limited to content entities and views (the latter facilitated through the “REST export” display in Views), and custom resources added through REST plugins.
  • HAL builds on top of the Serialization module and adds the Hypertext Application Language normalization, a format that enables you to design an API geared toward clients moving between distinct resources through hyperlinks.
  • Basic Auth allows you to include a username and password with request headers for operations requiring permissions beyond that of an anonymous user. It should only be used with HTTPS.

Core REST adheres strictly to REST principles in that resources directly match their URIs (accessible via a query parameter, e.g. ?_format=json for JSON) and in the ability to serialize non-content into JSON or XML representations. By default, core REST also includes two authentication mechanisms: basic authentication and cookie-based authentication.
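As a sketch of what such a request looks like in practice, the shell function below builds a core REST GET request for a node (the site base URL and node ID are placeholders; authenticated operations would additionally supply credentials, e.g. via Basic Auth over HTTPS):

```shell
# Sketch: fetch a node from a Drupal 8 core REST endpoint as JSON.
# The base URL and node ID passed in are placeholders for illustration.
fetch_node() {
  base="$1"  # e.g. http://drupal8.drude
  nid="$2"   # e.g. 1
  # Core REST selects the representation via the _format query parameter.
  curl --silent --header 'Accept: application/json' \
    "$base/node/$nid?_format=json"
}

# Usage (against a hypothetical site): fetch_node http://drupal8.drude 1
```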

While core REST provides a range of features with only a few steps of configuration there are several reasons why other options, available as contributed modules, may be a better choice. Limitations of core REST include the lack of support for configuration entities as well as the inability to include file attachments and revisions in response payloads. With your help, we can continue to improve and expand core's REST support.

RELAXed Web Services

As I highlighted in my recent blog post about improving Drupal's content workflow, RELAXed Web Services is part of a larger suite of modules handling content staging and deployment across environments. It is explicitly tied to the CouchDB API specification, and when enabled, will yield a REST API that operates like the CouchDB REST API. This means that CouchDB integration with client-side libraries such as PouchDB and Hood.ie makes possible offline-enabled Drupal, which synchronizes content once the client regains connectivity. Moreover, people new to Drupal with exposure to CouchDB will immediately understand the API, since there is robust documentation for the endpoints.

RELAXed Web Services depends on core's REST modules and extends its functionality by adding support for translations, parent revisions (through the Multiversion module), file attachments, and especially cross-environment UUID references, which make it possible to replicate content to Drupal sites or other CouchDB compatible services. UUID references and revisions are essential to resolving merge conflicts during the content staging process. I believe it would be great to support translations, parent revisions, file attachments, and UUID references in core's RESTful web services — we simply didn't get around to them in time for Drupal 8.0.0.

Services

Since RESTful Web Services are now incorporated into Drupal 8 core, relevant contributed modules have either been superseded or have gained new missions in the interest of extending existing core REST functionality. In the case of Services, a popular Drupal 7 module for providing Drupal data to external applications, the module has evolved considerably for its upcoming Drupal 8 release.

With Services in Drupal 8 you can assign a custom name to your endpoint to distinguish your resources from those provisioned by core and also provision custom resources similar to core's RESTful Web Services. In addition to content entities, Services supports configuration entities such as blocks and menus — this can be important when you want to build a decoupled application that leverages Drupal's menu and blocks system. Moreover, Services is capable of returning renderable objects encoded in JSON, which allows you to use Drupal's server-side rendering of blocks and menus in an entirely distinct application.

At the time of this writing, the Drupal 8 version of Services module is not yet feature-complete: there is no test coverage, no content entity validation (when creating or modifying), no field access checking, and no CSRF protection, so caution is important when using Services in its current state, and contributions are greatly appreciated.

GraphQL

GraphQL, originally created by Facebook to power its data fetching, is a query language that enables fewer queries and limits response bloat. Rather than tightly coupling responses with a predefined schema, GraphQL overturns this common practice by allowing for the client's request to explicitly tailor a response so that the client only receives what it needs: no more and no less. To accomplish this, client requests and server responses have a shared shape. It doesn't fall into the same category as the web services modules that expose a REST API and as such is absent from the table above.

GraphQL shifts responsibility from the server to the client: the server publishes its possibilities, and the client publishes its requirements instead of receiving a response dictated solely by the server. In addition, information from related entities (e.g. both a node's body and its author's e-mail address) can be retrieved in a single request rather than successive ones.
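For instance, a single query can ask for a node's title, body, and its author's e-mail address at once. The sketch below is hypothetical: the exact field and argument names depend on the schema the GraphQL module exposes for your site's content model.

```graphql
# Hypothetical schema; field and argument names depend on the site.
{
  node(id: 1) {
    title
    body
    author {
      mail
    }
  }
}
```

The server's response mirrors this shape exactly, returning only the requested fields.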

Typical REST APIs tend to be static (or versioned, in many cases, e.g. /api/v1) in order to facilitate backwards compatibility for applications. However, in Drupal's case, when the underlying content model is inevitably augmented or otherwise changed, schema compatibility is no longer guaranteed. For instance, when you remove a field from a content type or modify it, Drupal's core REST API is no longer compatible with those applications expecting that field to be present. With GraphQL's native schema introspection and client-specified queries, the API is much less opaque from the client's perspective in that the client is aware of what response will result according to its own requirements.

I'm very bullish on the potential for GraphQL, which I believe makes a lot of sense in core in the long term. I featured the project in my Barcelona keynote (demo video), and Acquia also sponsored development of the GraphQL module (Drupal 8 only) following DrupalCon Barcelona. The GraphQL module, created by Sebastian Siemssen, now supports read queries, implements the GraphiQL query testing interface, and can be integrated with Relay (with some limitations).

Conclusion

For most simple REST API use cases, core REST is adequate, but core REST can be insufficient for more complex use cases. Depending on your use case, you may need more off-the-shelf functionality without the need to write a resource plugin or custom code, such as support for configuration entity CRUD (Services); for revisions, file attachments, translations, and cross-environment UUIDs (RELAXed); or for client-driven queries (GraphQL).

Special thanks to Preston So for contributions to this blog post and to Moshe Weitzman, Kyle Browning, Kris Vanderwater, Wim Leers, Sebastian Siemssen, Tim Millwood and Ted Bowman for their feedback during its writing.

Drupal Watchdog Joins the Linux New Media Family

Posted by Drupal Watchdog on May 4, 2016 at 6:04am
Drupal Watchdog 6.01 is the first issue published by Linux New Media.

Come see the Drupal Watchdog team at DrupalCon 2016!

Drupal Watchdog was founded in 2011 by Tag1 Consulting as a resource for the Drupal community to share news and information. Now in its sixth year, Drupal Watchdog is ready to expand to meet the needs of this growing community.

Drupal Watchdog will now be published by Linux New Media, aptly described as the Pulse of Open Source.

“It’s very clear that the folks at Linux New Media know what they’re doing, and that they truly value the open source culture,” said Jeremy Andrews, CEO/Founding Partner, Tag1 Consulting. “I’m ecstatic that the magazine will not just live on, but it will thrive as a quarterly publication … this is a wonderful step forward that benefits everyone who reads and contributes to Drupal Watchdog.”

The magazine will continue to be offered in print and digital formats, and Linux New Media’s international structure provides better service to subscribers worldwide, with local offices in North America and Europe and ordering options in various local currencies.

“We don’t want to change what has brought Drupal Watchdog this far, but we do want to see it grow and expand to the next level, which mainly means – extending the reach of the magazine,” said Brian Osborn, CEO and Publisher, Linux New Media. “As our first step, Drupal Watchdog will now be published quarterly, helping us stay even more current in our coverage and in more frequent contact with our readership.”

Drupal Watchdog is written for the Drupal community and will only thrive through community participation.

Here is what you can do to help:

The first issue of Drupal Watchdog published by Linux New Media will be available May 9th! All DrupalCon attendees will receive a copy at the event. Come meet the new team, and learn more about the future of Drupal Watchdog!


Enabling Fancy Attributes in Commerce 2.x

Posted by Drupal Commerce on May 3, 2016 at 10:20pm

In Drupal Commerce 1.x, we used the Commerce Fancy Attributes and Field Extractor modules to render attributes more dynamically than just using simple select lists. This let you do things like show a color swatch instead of just a color name for a customer to select.


Fancy Attributes on a product display in Commerce Kickstart 2.x.

In Commerce 2.0-alpha4, we introduced specific product attribute related entity types. Building on top of them and other contributed modules, we can now provide fancy attributes out of the box! When presenting the attribute dropdown, we show the labels of attribute values. But since attribute values are fieldable, we can just as easily use a different field to represent it, such as an image or a color field. To accomplish this, we provide a new element type that renders the attribute value entities as a radio button option.

Read more to see an example configuration.

Performance boosting in Drupal 8 just got better

Posted by Finalist Drupal Blog on May 3, 2016 at 10:00pm

With the release of Drupal 8.1 on April 20th, the BigPipe module was added to core to increase the speed of Drupal for anonymous and logged-in visitors.

What does BigPipe do in general?

BigPipe is a technique for rendering a webpage in phases. It assembles the complete page from components, ordered by the speed of the components themselves. This technique gives visitors the feeling that the website is faster than it may actually be, giving a boost to the user experience.

This technique, originally developed by Facebook, applies the idea of multithreading, just like processors do. It disperses multiple calls to a single backend to make full use of the web server, thus rendering a webpage faster than conventional rendering does.

What does BigPipe do in Drupal?

For “normal” websites with anonymous visitors, BigPipe doesn't do much. If you use a caching engine like Varnish, or even Drupal's cache itself, pages are generally rendered fast enough. But with dynamic content, like lists of related, personalized, or localized content, BigPipe can kick in and really make a difference. When the website is opened, BigPipe first returns the page skeleton that can be cached: elements like the menus, footer, header, and often even content. Then rendering of the dynamic content starts. This means the visitor of your website is already reading the most important content, and will see the dynamic related list later on, after it's loaded asynchronously.

For websites with logged-in users, BigPipe can be a real boost in performance. Standard Drupal caching doesn't work out of the box for logged-in users. For Drupal 7 you had the Authenticated User Page Caching (Authcache) module (which had some disadvantages), but for Drupal 8 there was nothing. Until Drupal 8.1!

With BigPipe Drupal is now able to cache certain parts of the page (the skeleton which I mentioned above) and to multithread some other parts. And these parts are cacheable by themselves.

Click on the image below to watch a video on YouTube where you can see for yourself how much effect BigPipe can have.

Video is made by Dries Buytaert

BigPipe in Drupal

As I said, starting from Drupal 8.1 BigPipe is included as a core module, and everybody can use it. Whether you are using a budget hosting platform or hosting your own website on state-of-the-art servers, it is basically just one click away. You can simply enable the module and get all the benefits BigPipe has to offer!

Drupal 8 Module of the Week: Display Suite

Posted by Acquia Developer Center Blog on May 3, 2016 at 6:33pm
Drupal 8 logo

Each day, more Drupal 7 modules are being migrated over to Drupal 8 and new ones are being created for the Drupal community's latest major release. In this series, the Acquia Developer Center is profiling some of the most prominent, useful modules, projects, and tools available for Drupal 8. This week: Display Suite.

Tags: acquia drupal planet, ds, display suite, drag and drop, ui, ux

Moving an existing site into Platform.sh

Posted by Nacho Digital on May 3, 2016 at 6:26pm
A walk through my first Platform experience with steps on how to avoid problems if you need to upload your local version into it. Lots of hints to get it right the first time!

If you are in a hurry and only need a recipe, please head to the technical part of the article, but I would like to start by sharing a bit of my experience first, because you might still be deciding whether Platform is for you.

I decided to try Platform because a friend of mine needed a site. Due to several reasons I didn't want to host it on my personal server, but I didn't want to run a server for him either. I wanted to forget about maintaining the server, keeping it secure, or doing upgrades to it.

So I started thinking about options for small sites:

The Art of Sketching before Wireframing. OK, it’s not really Art!

Posted by Bluespark Labs on May 3, 2016 at 5:41pm

So you’ve got a great idea. Spent months thinking about it. Sold the idea internally to key stakeholders. Grabbed the attention of the right people, organized a team, and managed to get funding. You’ve selected your agency, have gone through a discovery process, and are ready to design out the idea. Now what?

Well, you start by sketching, of course. Yup, I said it: we start by drawing pretty pictures (well, not so pretty really).

The power of sketching. There’s no need for commitment here.

You may think you don’t need to sketch because you already know how you want the interface to look. But oftentimes, when you actually start sketching, you’ll realize that the path you were so set on might not work best.

Sketching sets the tone for the rest of the design process. It ensures you’re creating a user experience that meets both user and stakeholder goals and objectives. Removing this step from the process puts you at a disadvantage as you’re more likely to get locked into a design because it’s more difficult to make quick iterations using software built for wireframing and design comps. Sketching allows you to visualize what an interface could become without committing to anything.

Sketching clutter is a means to an end

Initial sketches will likely uncover that you’re trying to cram too much onto the user’s screen, but that’s OK. We’re trying to uncover all possibilities so we can iterate quickly.

Having a UX team take an outside-in approach can really help define what you’re trying to achieve without overwhelming the user.

We’ve found that sketching the pages/concepts can be beneficial in a number of ways:

  1. Speeds up the discovery phase by allowing all members of the team to get their thoughts on paper and get buy-in from key stakeholders

  2. Allows the team to iterate quickly on the structure of the site/application without focusing on design elements such as colors, fonts, imagery, etc.

  3. Offers a quick frame of reference to have early implementation discussions with developers on the project

  4. Offers the ability to highlight key areas for measurement to ensure we’re meeting business and project objectives

  5. Offers the ability to test real users with paper prototypes without writing a single line of code

Sketching enables you to work faster & iterate quickly

Start by drawing the high-level elements on the page, such as the main navigation, secondary navigation, footer elements, and high-level links. But also try to think about the positioning of elements on the page. Most users read left to right and top to bottom. Keeping that in mind, we can guide the user’s eyes to elements on the page by highlighting them with design characteristics such as color, graphics, etc.

Moving some navigational aids into the secondary nav or the footer doesn’t mean they’re less important, but it does allow us to simplify the interface and add clarity for users to achieve their online goal.

Sketching helps you brainstorm ideas

One of the biggest advantages of sketching is that everyone can do it. From designers to the director of human resources at your company (you don’t have to be an artist). So don’t be afraid to sketch out your ideas.

Sketching is an efficient way to get the ideas out of your head and out in the open for discussion. It keeps you from getting caught up in the technology, and instead focuses you on the best possible solution, freeing you to take risks that you might not otherwise take.

Getting everyone involved in this stage can be incredibly valuable for a couple of reasons. You can quickly get a good grasp of what you’re envisioning while gaining an understanding of the development process and interaction requirements, as you’re guided through the process.

What gets designed on the front end has a back end component that most clients don’t understand. Working with a UX team gives you the opportunity to gain that understanding while contributing with feedback that moves the project forward.

Sketching a UI develops multi-dimensional thinking

Designing a user interface is a process. Translating an idea to meet user requirements requires multi-dimensional thinking. Sketching a user interface is primarily a two-dimensional process, but as UX professionals we need to consider a number of factors:

  1. What is the user trying to accomplish?

  2. How is the user interacting with the site/application (desktop, mobile, kiosk, device specific, etc.)?

  3. How does the UI react as the user interacts with it?

  4. What appears on each of the pages as content and navigational aids?

  5. What if a user encounters an error? Are there tools to help them recover?

Sketching allows you to visualize the screen-to-screen interaction so that your idea becomes visible and clear in user interface form, ultimately helping you move the project to the next level.

Take your sketches up a notch with interactivity

Lastly, using an online prototyping tool offers the ability to upload the sketches and add hotspots over the navigation and linking aids. This allows you to click through rough sketches as if they were a real functioning website (a really ugly website). I can’t tell you the number of times I’ve worked on a series of sketches and didn’t realize that I was missing a major element or interaction until I added hotspots and tried to use it.

The design phase, beginning with the initial sketches, is a way to envision an interface that meets measurable goals. The ultimate goal is to align key business objectives with user goals. When those two things align, you’ve got a website or product that’s bound to succeed.

Tags: Drupal Planet, UX, UI, Design, wireframes, rapid iterative

My Drupal 8 Learning Path: Configuration Management in D8

Posted by Acquia Developer Center Blog on May 3, 2016 at 4:30pm
My Drupal 8 Path

Recently, fellow Acquian Tanay Sai published a blog with a link to the activity cards he and other members of the Acquia India team have been following to learn Drupal 8.

Each card is a self-contained lesson on a single topic related to Drupal 8, with a set objective, steps to complete the learning exercise, and links to blogs, documentation, and videos for more information.

Tags: acquia drupal planet

Set up a faceted Apache Solr search page on Drupal 8 with Search API Solr and Facets

Posted by Jeff Geerling's Blog on May 3, 2016 at 3:32pm

In Drupal 8, Search API Solr is the consolidated successor to both the Apache Solr Search and Search API Solr modules in Drupal 7. I thought I'd document the process of setting up the module on a Drupal 8 site, connecting to an Apache Solr search server, and configuring a search index and search results page with search facets, since the process has changed slightly from Drupal 7.

Install the Drupal modules

In Drupal 8, since Composer is now a de facto standard for including external PHP libraries, the Search API Solr module doesn't actually include the Solarium code in the module's repository. So you can't just download the module off Drupal.org, drag it into your codebase, and enable it. You have to first ensure all the module's dependencies are installed via Composer. There are two ways that I recommend for doing this (both are documented in the module's issue: Keep Solarium managed via composer and improve documentation):
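A sketch of the Composer route (the wrapper function is mine for readability; the package names follow Drupal.org's drupal/* Composer namespace, and including Facets alongside Search API Solr is an assumption based on this post's title):

```shell
# Sketch: add Search API Solr (and Facets) to a Composer-managed
# Drupal 8 codebase; Composer also pulls in the Solarium library
# the module depends on.
install_search_deps() {
  composer require drupal/search_api_solr drupal/facets
}

# Run from your Drupal 8 project root: install_search_deps
```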

The Secret Sauce podcast, Ep. 16: Finding Your Purpose as a Drupal Agency

Posted by Palantir on May 3, 2016 at 3:03pm

CEO and Founder George DeMet shares a continuation of ideas presented at DrupalCon Barcelona with his new talk on the benefits of running a company according to a set of clearly defined principles, which he's presenting next week at DrupalCon New Orleans. It's called Finding Your Purpose as a Drupal Agency.

iTunes | RSS Feed | Download | Transcript

We'll be back next Tuesday with another episode of the Secret Sauce and a new installment of our long-form interview podcast On the Air With Palantir next month, but for now subscribe to all of our episodes over on iTunes.

Want to learn more? We have built Palantir over the last 20 years with you in mind, and are confident that our approach can enhance everything you have planned for the web.


Transcript


Allison Manley [AM]: Hi, and welcome to the Secret Sauce by Palantir.net. This is our short podcast that gives quick tips on small things you can do to make your business run better. I’m Allison Manley, an account manager here at Palantir, and today’s advice comes from George DeMet, our Founder and CEO, who as a small business owner knows a thing or two about how to run a company based on clearly defined principles.

George DeMet [GD]: My name is George DeMet, and I’m here today to talk about the benefits of running a company according to a set of clearly defined principles. What follows is taken from a session that I presented last fall at DrupalCon Barcelona on Architecting Companies that are Built to Last.

At the upcoming DrupalCon New Orleans in mid-May, I'll be continuing this conversation in an all-new session called Finding Your Purpose as a Drupal Agency. If you're able to attend DrupalCon New Orleans, I hope you'll check it out.

Some time back I came across an article from the early 1970s about my grandfather, who was also named George DeMet. He was a Greek immigrant who spent more than 60 years running several candy stores, soda fountains, and restaurants in Chicago. While the DeMet’s candy and restaurant business were sold decades ago, the brand survives to this day and you can still buy DeMet’s Turtles in many grocery stores.

I never really got to know my grandfather, who died when I was 7 years old, but I have heard many of the stories that were passed down by my grandmother, my father, and other members of the family.

And from those stories, I’ve gotten a glimpse into some of the principles and values that helped make that business so successful for so long. Simple things, like honesty, being open to new ideas, listening to good ideas from other people, and so forth.

And as I was thinking about those things, I started doing some research into the values that so-called family businesses have in general, and that some of the oldest companies in history have in particular.

The longest lasting company in history was Kongo Gumi, a Japanese Buddhist temple builder that was founded in the year 578 and lasted until 2006. At the time that Kongo Gumi was founded, Europe was in the middle of the dark ages following the fall of the Roman Empire, the prophet Muhammed was just a child, the Mayan Empire was at its peak in Central America, and the Chinese had just invented matches.

At some point in the 18th century the company’s leadership documented a series of principles that were used by succeeding generations to help guide the company.

This included advice that’s still relevant to many companies today, like:

  • Always use common sense
  • Concentrate on your core business
  • Ensure long-term stability for employees
  • Maintain balance between work and family
  • Listen to your customers and treat them with respect
  • Submit the cheapest and most honest estimate
  • Drink only in moderation

Even though the Buddhist temple construction and repair business is a pretty stable one, they still had to contend with a lot of changes over their 1,400 year history. Part of what helped was that they had unusually flexible succession planning; even though the company technically was in the same family for 40 generations, control of the company didn’t automatically go to the eldest son; it went to the person in the family who was deemed the most competent, and sometimes that person was someone who was related by marriage.

Kongo Gumi not only built temples that were designed to last centuries, but they also built relationships with their customers that lasted for centuries.

In the 20th century, Kongo Gumi branched out into private and commercial construction, which helped compensate for the decline in the temple business. They also didn’t shy away from changes in technology; they were the first in Japan to combine traditional wooden construction with concrete, and the first to use CAD software to design temples.

And while Kongo Gumi’s business had declined as they entered the 21st century, what ultimately did them in were speculative investments that they had made in the 80’s and early 90s in the Japanese real estate bubble.

Even though they were still earning more than $65 million a year in revenue in the mid-2000s, Kongo Gumi was massively over-leveraged and unable to service the more than $343 million in debt they had accumulated since the collapse of the bubble, and they ended up being absorbed by a larger construction firm.

Principles are designed to help answer the question of *how* a company does things, and what criteria they should use to make decisions. In the end, Kongo Gumi was no longer able to survive as an independent entity after 1,400 years in business not because of economic upheaval or changes in technology, but because they strayed from their core principles, stopped taking the long view, and went for the quick cash.

Companies that want to be successful in the long run need to identify their core principles and stick to them, even when doing so means passing up potentially lucrative opportunities in the short term.

Regardless of whether the business involves building Buddhist temples, making chocolate-covered pecans, or building websites, a focus on sustainability over growth encourages companies to put customers and employees first, instead of shareholders and investors. These kinds of companies are uniquely positioned to learn from their failures, build on success, and learn how to thrive in an ever-changing business landscape.

AM: Thank you George! George will be presenting his session, Finding Your Purpose as a Drupal Agency at DrupalCon New Orleans on Wednesday, May 11. You can find out more on our website at palantir.net and in the notes for this particular podcast episode.

If you want to see George’s presentation from DrupalCon Barcelona last year on Architecting Drupal Businesses that are Built to Last, you can also find that link in the notes for this episode as well.

For more great tips, follow us on Twitter at @palantir, or visit our website at palantir.net. Have a great day!

How To Create Custom SOLR Search With Autocomplete In Drupal 7

Posted by Valuebound on May 3, 2016 at 11:35am

In many cases, users visiting a site already know what they are looking for, so they head straight to the search box. Since it is likely to be their first point of contact with the website, retailers must ensure that they get it right the first time, to avoid users becoming frustrated by inaccurate or badly-ranked results and moving on to a different site.

A good starting point is to introduce a 'suggest' (autocomplete) function, which lists a drop-down menu of search queries containing the text fragment the user has typed.

We will integrate Apache Solr with our Drupal site and build an autocomplete search.
For this, we need to query Apache Solr, fetch the results, and finally display them in…
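As a rough illustration of the request involved, Solr's Terms component can serve prefix suggestions for an autocomplete box. The sketch below only builds the query URL and parses the JSON payload — the core name, field name, and endpoint are assumptions, so adjust them to your own Solr setup:

```python
import json
from urllib.parse import urlencode

def solr_suggest_url(base_url, core, prefix, field="spell", limit=5):
    """Build a query URL for Solr's Terms component, which returns
    indexed terms starting with the fragment the user has typed."""
    params = urlencode({
        "terms.fl": field,       # field to pull suggestions from
        "terms.prefix": prefix,  # what the user has typed so far
        "terms.limit": limit,
        "wt": "json",
    })
    return "%s/%s/terms?%s" % (base_url.rstrip("/"), core, params)

def parse_terms_response(raw):
    """Extract suggestion strings from a Terms component JSON response.
    The payload alternates each term with its document count, so keep
    every other entry."""
    data = json.loads(raw)
    flat = data["terms"]["spell"]
    return flat[0::2]

# For example:
url = solr_suggest_url("http://localhost:8983/solr", "drupal", "dru")
sample = '{"terms": {"spell": ["drupal", 12, "drush", 4]}}'
print(parse_terms_response(sample))  # ['drupal', 'drush']
```

On the Drupal side, a menu callback would fetch a URL like this and return the suggestion list as JSON for the autocomplete widget to render.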

Improve The Drupal 8 Admin Menu for Content Creators

Posted by OSTraining on May 3, 2016 at 10:52am

The Drupal admin interface needs to keep a lot of people happy: it is often used by everyone from very experienced users to complete beginners.

One of our members asked if it was possible to create a custom menu for their content creators. They wanted one single place for Drupal beginners to find all the links they needed.

In this tutorial, we'll show you how to do that and also create a faster, more usable admin menu.

How I learned to stop worrying and love custom migration classes

Posted by Red Route on May 3, 2016 at 8:40am

When I got sick of banging my head against the migration-shaped wall the other day, the state of my attempts to migrate content from Drupal 6 was fairly limited.

Migrate Upgrade was working fairly well, up to a point.

Gallery nodes had been migrated, but without their addresses, which is hardly surprising, seeing as the D6 site was using the Location module, and I've decided to go with Geolocation and Address for the new site.

Exhibition nodes had been migrated, but without the node references to galleries. There's an issue with a patch on drupal.org for this, but after applying the patch, files weren't being migrated.

Time to get stuck in and help fix the patch, I thought. But the trouble is that we're dealing with a moving target. With the release of Drupal 8.1.0, the various migrate modules are all changing as well, and core patches shouldn't be against the 8.0.x branch anymore. It's all too easy to imagine that updating to the latest versions of things will solve all your problems. But often you just end up with a whole new set of problems, and it's hard to figure out where the problem is, in among so much change.

Luckily, by the time I'd done a bit of fiddling with the theme, somebody else had made some progress on the entity reference migration patch, so when I revisited the migrations, having applied the new version of the patch, the exhibitions were being connected to the galleries correctly.

One problem I faced was that the migration would often fail with the error "MySQL server has gone away" - with some help from drupal.org I learned that this wouldn't be so bad if the tables used InnoDB. Converting one of the suggestions into a quick script to update all the Drupal 6 tables really helped, although copying the my.cnf settings killed my MySQL completely for some reason. Yet another reminder to keep backups when you're changing things.
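A script like that can be as simple as generating one ALTER TABLE statement per table and piping the output back into the mysql client — a sketch of the generation step (the table names are just typical Drupal 6 examples; in practice you'd pull the list from information_schema):

```python
def innodb_conversion_sql(tables):
    """Generate ALTER TABLE statements that convert each table to
    InnoDB. Feed in the table list for the D6 database and pipe the
    resulting statements back into the mysql client."""
    return ["ALTER TABLE `%s` ENGINE=InnoDB;" % name for name in tables]

# For example, for a couple of typical Drupal 6 tables:
for statement in innodb_conversion_sql(["node", "term_node"]):
    print(statement)
```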

Having read some tutorials, and done some migrations in Drupal 6 and 7, I was trying to tweak the data inside the prepareRow method in my custom migration class. The thing I didn't get for ages was that this method is provided by the Migrate Plus module, so not only did the module have to be enabled, but the migration definition yml file names also needed to start with migrate_plus.migration rather than migrate.migration.
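As a sketch, a definition file following that naming convention might look like this — the file name, ids, and field names below are illustrative, not taken from the actual project:

```yaml
# config/install/migrate_plus.migration.gallery_node.yml
# The migrate_plus.migration prefix is what lets Migrate Plus
# discover the migration and call prepareRow() on its source plugin.
id: gallery_node
label: 'Gallery nodes from D6'
source:
  plugin: d6_node
  node_type: gallery
process:
  title: title
destination:
  plugin: 'entity:node'
  default_bundle: gallery
```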

Once I'd made that change, the prepareRow method fired as expected, and from there it was relatively straightforward to get the values out of the old database, even in the more complex migrations like getting location data from another table and splitting it into two fields.

As an example, here's the code of the prepareRow method in the GalleryNode migration class:


/**
 * {@inheritdoc}
 */
public function prepareRow(Row $row) {
  if (parent::prepareRow($row) === FALSE) {
    return FALSE;
  }

  // Make sure that URLs have a protocol.
  $website = $row->getSourceProperty('field_website');
  if (!empty($website)) {
    $url = $website[0]['url'];
    $website[0]['url'] = _gallerymigrations_website_protocol($url);
    $row->setSourceProperty('field_website', $website);
  }

  // Get the location data from the D6 database.
  $nid = $row->getSourceProperty('nid');
  $location = $this->getLocation($nid);

  // Set up latitude and longitude for use with geolocation module.
  $geolocation = $this->prepareGeoLocation($location->latitude, $location->longitude);
  $row->setSourceProperty('field_location', $geolocation);

  $address = $this->prepareAddress($location);
  $row->setSourceProperty('field_address', $address);

  // parent::prepareRow() was already invoked above, so don't call it
  // a second time (that would fire the prepare-row hooks twice).
  return TRUE;
}

The methods called by this are all fairly similar, with a switch to the D6 database followed by a query - here's an example:


/**
 * Get the location for this node from the D6 database.
 *
 * @param int $nid
 *   The node ID of the gallery.
 *
 * @return object
 *   The database row for the location.
 */
protected function getLocation($nid) {
  // Switch connection to access the D6 database.
  \Drupal\Core\Database\Database::setActiveConnection('d6');
  $db = \Drupal\Core\Database\Database::getConnection();

  $query = $db->select('location_instance', 'li');
  $query->join('location', 'l', 'l.lid = li.lid');
  $query->condition('nid', $nid);
  $query->fields('l', array(
    'name',
    'street',
    'additional',
    'city',
    'province',
    'postal_code',
    'country',
    'latitude',
    'longitude',
    'source',
  ));

  $result = $query->execute();

  // Revert to the default database connection.
  \Drupal\Core\Database\Database::setActiveConnection();

  $data = array();
  foreach ($result as $row) {
    $data[] = $row;
  }

  // There should be only one row, so return it (or NULL if none found).
  return isset($data[0]) ? $data[0] : NULL;
}

I get the feeling that if I was following the "proper" object-oriented approach, I'd be doing this using a process plugin, as suggested by this tutorial from Advomatic. But this does the job, and the code doesn't feel all that dirty.

Another lesson I learned the hard way is that when you're adding fields from other sources inside the prepareRow method, you also need to remember to add those fields into the .yml file.
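In other words, a source property created in prepareRow only reaches the destination if the yml file maps it too — a minimal sketch, with illustrative field names:

```yaml
process:
  # These source properties are populated in prepareRow(), but the
  # migration only picks them up if they are also mapped here.
  field_location: field_location
  field_address: field_address
```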

Feeling pleased with myself that I'd managed to migrate the location data, I decided to jump down the rabbit hole of working on integration between Geolocation and Address modules, even though I'd already said I didn't need to do it. Why do developers do that? I can see how difficult a project manager's job can be sometimes. Thankfully, the integration (at least for the needs of this site) can be a fairly shallow and simple job with a few lines of JavaScript, so I've put a patch up for review.

In my day job, I'm a great believer in breaking tasks down as far as possible so that code can be reviewed in small branches and small commits. But when you're working on your own project, it's easy to jump around from task to task as the mood takes you. You can't be bothered with creating branches for every ticket - after all, who's going to review your code? Half the time, you can't even be bothered creating tickets - you're the product owner, and the backlog is largely in your head.

That butterfly tendency, plus the number of patches I'm applying to core and contributed modules, means that my local site has far more uncommitted changes than I'd normally be comfortable with. Using Git changelists in PhpStorm has really helped me cope with the chaos.

On the subject of patches, I've finally got round to trying out Dave Reid's patch tool - it's working really well so far.

This process has reinforced in my mind the value of testing things like migrations on a small sample set. Thankfully, the Drupal 6 version of the Admin Views module lets you bulk delete nodes and taxonomy terms - I couldn't face tweaking the migration while running multiple iterations of importing 3828 terms.

Which reminds me, xdebug is great, but remember to disable it after you've finished with it, otherwise using the site in your VM will be slow, and as Joel Spolsky says, when things run slowly, you get distracted and your productivity suffers. Humans are not good at multitasking, especially when those tasks are complex and unfamiliar.

And when we try to multitask, we don't think straight. I've just spent an hour debugging something that should just work, because the logic in my taxonomy term source plugin was based on a piece of confusion that now seems obvious and stupid. For reference, in Drupal 6, the 'term_node' table connects nodes with the taxonomy terms they're tagged with, and vid refers to the node revision ID, whereas the 'taxonomynode' table connects terms with their related taxonomy node, and vid refers to the vocabulary ID.

The bad news is that the mappings from nodes to taxonomy terms aren't being migrated properly - for some strange reason they're being registered correctly, but all the rows are being ignored.

The good news is that the work in progress is now online for the world to see. For one thing, it's easier to do cross-browser testing that way, rather than faffing around with virtual machines and proxy tunnels and all that sort of nonsense.

So please, have a look, and if you spot any bugs, let me know by creating an issue on the project board.

Tags: Drupal, Drupal 8, The Gallery Guide

Drupal Console by the numbers

Posted by Drupal Console on May 3, 2016 at 8:22am
In this blog post, we will explore some interesting numbers related to the development of this project, between the first commit on Aug 28, 2013, and the day of writing this blog post, May 3, 2016. Keep reading to find out how much time has been invested in development, how many tagged releases we have made, the number of awesome contributors, and the number of downloads for this project, among other things.

Sprint with us on Commerce 2.x at DrupalCon New Orleans

Posted by Commerce Guys on May 3, 2016 at 4:10am

Three months ago Commerce Guys separated from Platform.sh to refocus the business around Drupal Commerce. Even as a three-person team (we're now four - welcome, Doug!), we worked hard to dedicate Bojan completely to Commerce 2.x in anticipation of DrupalCon New Orleans. As I discussed in the most recent DrupalEasy podcast, this resulted in great progress both for Commerce 2.x and for Drupal 8 itself. (It also kept us near the top of the most prolific contributors to Drupal. : )

While we're preparing to present the latest in Drupal Commerce in our session at 10:45 AM on Thursday, we're also getting ready to sprint on Commerce 2.x the full week from our booth. This will be our first opportunity to jam on code as a full team since our spinout, and we'd love to have you join us.

Look for us near the permanent coffee station (intentional) beside Platform.sh and Acro Media, our delivery affiliate in the U.S. whose partnership and vision for enterprise Drupal Commerce have been invaluable as we've rebooted our business.

If you'd like to get up to speed on the current status of development, we recommend the following resources:

Naturally, we're happy to help anyone begin to contribute to Drupal 8 / Commerce 2.x. Bojan has mastered the art of integrating developers from a variety of agencies of all skill levels into the development process so far. For an espresso or a Vieux Carré, he may even train you, too. ; )

FYI: Not going to DrupalCon New Orleans 2016

Posted by Doug Vann on May 3, 2016 at 12:13am

As DrupalCon New Orleans gets closer, I'm getting asked more frequently if I'm going. I'm honored to have so many hit me up and ask me directly via Twitter, LinkedIn, Skype, Google Hangouts, etc.

The answer is NO. And here's why...
Business is going super-dee-duper well and I can barely breathe. I haven't mastered the art of saying NO to new gigs, so that leaves me stretched thin and working long hours. Disappearing for a week would NOT fit well into that scenario. Going to NOLA would cost me not only the $2.5K to $3K of the event, but almost that much again in lost billables. If I don't do billable things, I can't send invoices. The work I would miss could not be made up later. Those dead hours would never see the light of day again.


Is there an ROI on these Drupal trips?
I have attended 7 North American DrupalCons from 2008 to 2014. It cost a lot of money, but I am absolutely and thoroughly convinced that the ROI is incalculable. Seriously... Many of the relationships I have today within the Drupal community can be traced to a DrupalCon, whether it be in a session, in the hallway, in the exhibit hall, in the hotel lobby, or at any of the numerous parties. There is no doubt in my mind that I wouldn't have the thriving business I have today had I not gone to 7 DrupalCons and many other events. In the Drupal community, personal relationships often lead to business relationships, either directly or via referral!
I missed last year [and blogged about it] because my wife and I were closing on our first house purchase and it was taking longer than anticipated. Not to mention I was also swamped with work at the time.

See you next year?
That is very possible. I hate the idea of missing a 3rd PreNote! And I definitely miss seeing all my friends and engaging in the amazing networking opportunities. We'll see. :-)
