
News

Posted over 5 years ago
This blog has been re-posted and edited with permission from Dries Buytaert's blog. Please leave your comments on the original post.

After months of hard work, the Drupal Governance Task Force made thirteen recommendations for how to evolve Drupal's governance.

Drupal exists because of its community. What started from humble beginnings has grown into one of the largest Open Source communities in the world, thanks to the collective effort of thousands of community members. What distinguishes Drupal from other open source projects is both the size and diversity of our community and the many ways in which thousands of contributors and organizations give back. It's a community I'm very proud to be a part of. Without the Drupal community, the Drupal project wouldn't be where it is today and perhaps wouldn't exist at all. That is why we are always investing in our community and why we constantly evolve how we work with one another.

The last time we made significant changes to Drupal's governance was over five years ago, when we launched a variety of working groups. Five years is a long time, and the time had come to take a step back and look at Drupal's governance with fresh eyes.

Throughout 2017, we did a lot of listening. We organized both in-person and virtual roundtables to gather feedback on how we can improve our community governance, which led me to invest a lot of time and effort in documenting Drupal's Values and Principles. In 2018, we transitioned from listening to planning. Earlier this year, I chartered the Drupal Governance Task Force. The goal of the task force was to draft a set of recommendations for how to evolve and strengthen Drupal's governance based on all of the feedback we received. Last week, after months of work and community collaboration, the task force shared thirteen recommendations (PDF).

[Image: Me reviewing the Drupal Governance proposal on a recent trip.]

Before any of us jump to action, the Drupal Governance Task Force recommended a thirty-day open commentary period to give community members time to read the proposal and provide more feedback. After the thirty-day commentary period, I will work with the community, various stakeholders, and the Drupal Association to see how we can move these recommendations forward. During the open commentary period, you can get involved by collaborating and responding to each of the individual recommendations below:

- Create a Community Governance Group
- Improve collaboration between the Drupal Association and the community
- Clarify and expand local Drupal Associations
- Grow the Community Working Group (CWG) to offer more support
- Create a Community Strategic Plan
- Expand onboarding and mentoring to increase the contributor pipeline
- Provide greater support for in-person events
- Build a new community website to centralize communication and promote opportunities
- Create community training offerings to develop leadership skills
- Define key community terms in clear, translatable language
- Create a Drupal Community Diversity Statement
- Improve definitions of representation, leadership, and the expected higher standards
- Establish processes for handling conflicts of interest

I'm impressed by the thought and care that went into writing the recommendations, and I'm excited to help move them forward. Some of the recommendations are not new; they are ideas that the Drupal Association, I, or others have been working on, but that none of us have been able to move forward without a significant amount of funding or collaboration. I hope that 2019 will be a year of organizing and finding the resources that allow us to take action and implement a number of the recommendations. I'm convinced we can make valuable progress.

I want to thank everyone who has participated in this process. This includes community members who shared information and insight, facilitated conversations around governance, were interviewed by the task force, and supported the task force's efforts. Special thanks to all the members of the task force who worked on this with great care and determination for six straight months: Adam Bergstein, Lyndsey Jackson, Ela Meier, Stella Power, Rachel Lawson, David Hernandez and Hussain Abbas.
Posted over 5 years ago
Start, stop, or restart the Apache web server from the terminal on macOS to make your life easier.
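As a quick sketch of the kind of commands this refers to (assuming the Apache build that ships with macOS, which is controlled with apachectl; an Apache installed another way, for example via Homebrew, is managed differently):

```sh
# Control the built-in Apache on macOS (needs an administrator password)
sudo apachectl start       # start the web server
sudo apachectl stop        # stop it
sudo apachectl restart     # stop and start in one step
sudo apachectl configtest  # optional: validate the configuration before restarting
```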
Posted over 5 years ago
Stay Connected

MidCamp never stops (although we do take frequent coffee breaks), so make sure you stay connected. We're on the MidCamp Slack year-round to discuss the event, Drupal, jobs, or other goings-on in the Chicagoland area. You can also meet up with us IRL by attending the Drupal Chicago Meetup. Watch all of the MidCamp sessions from our previous years on our YouTube channel. As always, be sure to follow us on Twitter, LinkedIn, and Instagram. For the most up-to-date information about all things MidCamp, subscribe to our newsletter.
Posted over 5 years ago
How to Create and Manage a Content Workflow in Drupal 8: Either a Standard or a Custom One
adriana.cacoveanu, Wed, 11/14/2018 - 14:01

"A Drupal 8 initiative to improve Drupal's content workflow" — this is how Dries Buytaert first defined the Workflow Initiative, back in 2016. Now, coming back to 2018, you must be asking yourself a legitimate question: "How do I set up a content workflow in Drupal 8?" "How do I manage, extend and customize an editorial workflow to fit my Drupal 8 website's publishing needs? One including multiple users, with different permissions, that manages the workflow status of... different content types."
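For context, Drupal 8 core ships the Workflows and Content Moderation modules that editorial workflows like the one described here build on. A minimal getting-started sketch follows; the use of Drush and these exact commands are my assumption, not part of the truncated excerpt above:

```sh
# Enable the core modules that provide editorial workflows in Drupal 8
drush en workflows content_moderation -y

# Workflows (states such as Draft/Published and the transitions between them)
# are then defined in the admin UI at /admin/config/workflow/workflows
drush cr   # rebuild caches after configuration changes
```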
Posted over 5 years ago
Firas Ghunaim, November 14, 2018

The role of marketers has evolved beyond simply managing and coordinating online advertising campaigns to drive traffic to their respective websites; successful marketers must transform themselves into storytellers. Storytelling is an essential skill for mastering the art of crafting a digital experience for your digital business's would-be customers and users. Therefore, as a digital business, you must be strategic when it comes to choosing the appropriate platform for your digital experience.

The fact that you can easily publish and manage multimedia content across multiple websites under one brand has made Drupal a popular and strategic platform for enterprises, the media, healthcare, and even digital governments. In short, Drupal is a dream come true for marketers working in the aforementioned industries and sectors.

Here are 5 Drupal modules that were created by the Drupal community for marketers:

1. HubSpot

HubSpot is a widely used and popular inbound marketing software platform that helps companies attract visitors, convert leads, and close customers. Lead generation is a carefully planned, ongoing process, deliberate in targeting users with personalized content marketing. Marketers rely on HubSpot CRM to qualify the leads generated from the landing pages they develop on their websites.

Marketers with Drupal-based websites have saved a lot of time and effort by connecting their HubSpot CRM with the web forms that capture the desired user data. For example, a Webform-based contact form on your site can send its data to HubSpot, where you may already track potential clients and contacts; or a Webform-based e-newsletter signup could send the lead to HubSpot's targeted marketing system, letting you use your pre-existing email campaigns. Moreover, marketers that create content on HubSpot can easily display it in Drupal 8's front end. Contact us to integrate HubSpot CRM into your Drupal website seamlessly.

2. Webform

With almost 5,000,000 downloads and nearly 500,000 websites using it, Webform is one of the most popular Drupal modules out there. Forms are an essential feature of a digital experience that relies on gathering user data relevant to content marketing and on personalizing user experiences across all relevant digital platforms. Webform enables integration with various third-party marketing solutions such as MailChimp, HubSpot, and Salesforce, to name but a few. You can find a comprehensive list of Webform add-ons here. A great and simple guide to getting started with form building using Webform is available here, courtesy of OSTraining.

3. Google Analytics

This Drupal module adds the Google Analytics web statistics tracking system to your website. According to Builtwith.com, Google Analytics is the most popular analytics tool in the world, with at least 37 million live websites currently using the tracking and performance monitoring platform. Marketers that use Drupal websites benefit from the Google Analytics module to identify their traffic size and traffic sources, and to track the performance of their website with regard to ongoing personalization of the user experience. Being able to seamlessly integrate Google Analytics also provides marketers with real-time data on current site usage and user behavior.
In addition to the aforementioned, marketers can track almost every statistic imaginable, such as User ID, domain, and how many and which files were downloaded and by whom. For full details on the statistics that marketers can track, visit the official Drupal module page for Google Analytics here.

4. MailChimp

This module provides integration with MailChimp, a popular email delivery service. Regularly maintained and reliable, MailChimp is a favorite tool among email marketers, mainly due to its attention to both sides of the email exchange. This Drupal module allows email marketers to create and send email marketing campaigns from your website and to analyze campaign performance while monitoring how users interact with your emails. On the other side, your website visitors can easily subscribe to (or unsubscribe from) the email list of their preference. MailChimp is not limited to email delivery; you can also integrate MailChimp E-Commerce, which allows marketers to optimize their online store sales via personalized email marketing campaigns and automation workflows.

7 Drupal Modules That Every E-Commerce Must Have

5. Crazy Egg

This module provides integration with the Crazy Egg heat map service. Much of digital business and marketing success comes down to the ongoing enhancement of your digital experience. This requires a consistent effort of monitoring feedback from your users, who in the end must be able to enjoy an engaging user experience that doesn't feature frustrating issues such as slow page load speed and irrelevant content. Those frustrations can heavily impact your search engine optimization (SEO) efforts to rank higher on search engines like Google, which makes monitoring online user behavior on-site all the more imperative for marketers.

10 SEO Modules That Every Website Must Have

Crazy Egg is a simple Drupal module that is easy to install on your Drupal 8 (or D7) website and gives you access to various reporting formats that showcase online behavior on your web pages. By recording user behavior, marketers gain visual insight into how users interact with different elements, features, and components of their website. Marketers can understand where users face challenges browsing the site, which aspects of the website they spend most of their time on, and which they avoid completely. Ultimately, the feedback gained allows marketers to develop the best UI, UX, and content in a more informed manner.

Bonus: Varbase SEO

This is a core Varbase feature. We strongly recommend Varbase as the ultimate starter kit and distribution to build your Drupal digital experiences. Enterprise-level organizations and governments that rely upon an ongoing content marketing process require a content publishing and management solution that can handle heavy traffic without compromising performance standards. This is where Varbase saves the day. Not only is Varbase inherently optimized for all search engines, it also enables you as a content marketer to optimize your multilingual content, regardless of media format, for a diverse and global target audience. Varbase SEO provides:
- Optimized markup that is compliant and accessible to WCAG 2.0 Level AA standards
- An XML sitemap that is also language aware
- A content SEO grader and recommendations
- A full suite of meta tags and descriptive tags that make your site more optimized, integrated, and favorable to search engines such as Google, Yandex, and Bing, and to social media networks such as Facebook, Twitter, and many others
- Total control over how your site will look when appearing in search results
- Easy redirect handling to prevent dead links
- Readable and SEO-friendly URLs that automatically reflect your site's structure and hierarchy

You can view our work on digital experiences that were built using Varbase here.

Honorable Mentions:

Honeypot

One of the most popular Drupal modules available out there, used by marketers and Drupal platforms that wish to avoid spam. Honeypot keeps your database clean by blocking spambots from using your web forms, using both the honeypot and timestamp methods, and is not as intrusive as CAPTCHAs.

5 Security Modules Every Drupal Website Must Have

Accelerated Mobile Pages (AMP)

The AMP module is designed to convert Drupal pages into pages that comply with the AMP standard. AMP is important because it helps web pages load faster, which potentially improves usability and convinces visitors to stay longer on your site engaging with your content. The logic is straightforward: faster load times lead to better engagement, which reduces bounce rate and improves mobile ranking.

If we were to feature every Drupal module out there, the list would be way too long. So which Drupal modules do you prefer? How do they help you achieve your marketing goals? Share your preferences with us and we will surely feature them as soon as possible.
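All of the contributed modules mentioned above live on Drupal.org, so on a Drupal 8 site they are typically added with Composer and enabled with Drush. The sketch below is illustrative only; the module selection and the Drush workflow are assumptions rather than part of the original article, and each project page should be checked for current Drupal version support:

```sh
# Fetch the contributed modules discussed above from the Drupal.org package repository
composer require drupal/hubspot drupal/webform drupal/google_analytics \
                 drupal/mailchimp drupal/honeypot drupal/amp

# Enable them using their machine names as published on Drupal.org
drush en hubspot webform google_analytics mailchimp honeypot amp -y
```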
Posted over 5 years ago
Disseminating Knowledge: Drupal for Education and E-learning
Shankar, Wed, 11/14/2018 - 12:26

Have you always secretly wanted to spend your evenings writing symphonies, learning about filmography or assessing climate change? Studying niche subjects has traditionally been for niche students. But e-learning platforms have changed all that with the provision to learn almost any subject online. Corporate e-learning has witnessed a stupendous 900% growth in the last decade or so. With more and more e-learning platforms flourishing, organisations are striving to be the best to stand apart from the rest. Drupal has been a great asset in powering education and e-learning, with powerful capabilities that can help enterprises offer a wonderful digital experience. Let's trace the roots of e-learning before diving deep into the ocean of possibilities with Drupal for building an amazing e-learning platform.

Before the internet era

[Image: A brief history of e-learning | Source: eFront]

A brief history of e-learning can be traced through the compilation made by eFront. Even before the internet existed, distance education was being offered. In 1840, Isaac Pitman taught shorthand via correspondence: completed assignments were sent to him via mail and he would then send his students more work. Fast forward to the 20th century: the first testing machine, which enabled students to test themselves, was invented in 1924. The teaching machine was invented in 1954 by a Harvard professor to allow schools to administer programmed instruction to students. In 1960, the first computer-based training program (CBT program), called Programmed Logic for Automated Teaching Operation (PLATO), was introduced. The term 'e-learning' was first used at a CBT systems seminar in 1999. Eventually, with the internet and computers becoming the core of businesses, the 2000s saw the adoption of e-learning by organisations to train employees. Today, a plenitude of e-learning solutions are available in the form of MOOCs (Massive Open Online Courses), social platforms and Learning Management Systems, among others.

E-learning: Learn anywhere, anytime

In essence, e-learning refers to a computer-based educational tool or system that allows you to learn anywhere and at any time. It is the online method of building skills and knowledge across the complete workforce, and with customers and partners. It comes in numerous formats, like self-paced courses, virtual live classrooms or informal learning. Technological advancements have diminished the geographical gap with tools that can make you feel as if you are inside the classroom. E-learning provides the ability to share material in all sorts of formats, such as videos, slideshows, and PDFs. It is possible to conduct webinars (live online classes) and to communicate with professors via chat and message forums. There is a superabundance of different e-learning systems (otherwise known as Learning Management Systems, or LMS) and methods which enable courses to be delivered. With the right kind of tools, several processes can be automated, like the marking of tests or the creation of engrossing content. E-learning offers learners the ability to fit learning around their lifestyles, enabling even the busiest of people to further a career and gain new qualifications.
Merits and Demerits

Some of the major benefits are outlined below:

- No restrictions: E-learning facilitates learning without having to organise when and where everyone who is interested in a course can be present.
- Interactive and fun: Designing a course to be interactive and fun, with the use of multimedia or gamification, enhances engagement and the relative lifetime of the course.
- Affordable: E-learning is cost-effective. For instance, while textbooks can become obsolete, e-learning removes the need to perpetually acquire new editions by paying exorbitant amounts of money.

Some of the concerns that need to be taken care of:

- Practical skills: It is considered tougher to pick up skills like building a wooden table, pottery, and car engineering from online resources, as these require hands-on experience.
- Seclusion: Although e-learning enables a person to remotely access a classroom in his or her own time, learners may feel a sense of isolation. Tools such as video conferencing, social media and discussion forums can help them actively engage with professors or other students.
- Health concerns: With the mandatory need for a computer or mobile devices, health-related issues like eyestrain, bad posture, and other physical problems may be troublesome. However, proper guidelines can be sent to the learner beforehand, covering correct sitting posture, desk height, and recommendations for regular breaks.

Building Yardstick LMS with Drupal

OpenSense Labs built Yardstick LMS, a learning management system, for Yardstick Educational Initiatives, which caters to the students of various schools in Dubai.

[Image: Yardstick LMS homepage]

The architecture of the project involved a lot of custom development:

1. Yardstick Core: The core module of Yardstick LMS, where the creation, updating and deletion of nodes takes place.

2. Yardstick Quiz: We built this custom module for the whole functionality of the quiz component. It generates a quiz, a quiz palette and, after quiz completion, a quiz report based on validation of the report's visibility. We could generate three kinds of reports: an individual-level report where one's performance is evaluated, a section-level report where performance for each section is evaluated, and a grade-level report where performance for all the sections is compared and evaluated. For the quiz, we had different sub-components like questions, options, marks, average time to answer, learning objective, skill level score, and concept. The same question could be used for different quizzes, thereby minimising redundancy of the data. Also, an image, video or text could be added to questions.

3. Yardstick Bulk User Import: This module was built to assist administrators in creating users all at once by importing a CSV file. There is also an option to send an invitation mail with login credentials to all the users.

4. Yardstick Custom Login: We provided a custom login feature where the same login credentials could be used to log into the Yardstick system. That is, we provided an endpoint for verifying the login credentials, and upon success users were logged in.

5. Yardstick Validation: This module offers all the validation across the site, whether related to access permissions or to time validation.

6. Yardstick Challenge: It offers the user an option to submit a task which is assigned to them, where they are provided with a text area and a file upload widget.
Yardstick LMS has an intricate structure

On the end-user side there is a seamless flow, but as we go deeper it becomes challenging. Yardstick LMS has an intricate structure. We had two kinds of login: normal login using Yardstick credentials, and school-specific login, such as for Delhi Public School (DPS) users.

[Image: Yardstick LMS custom login for DPS users]

For DPS users, we used the same login form but different functionality for validating credentials. DPS gave us an endpoint to which we sent a POST request with the username and password. If the username and password were correct, the endpoint returned the user information. When a username was received, we checked whether it already existed on our Yardstick system. If it did not exist, we programmatically created a new user with the information received from the endpoint and created a user session. If it did exist, we updated the password on our system.

Yardstick LMS is designed to govern multiple schools at the same time

We designed Yardstick LMS in such a way that multiple schools can be governed at the same time. All the students of the various schools learn the same content, thereby building uniformity. The core part of our system dwells in the modules. A module is a content type that can store numerous pieces of information, such as components, concept, description, objective and syllabus, among others. Several different components can be added, like Task, Quiz, Video task, Extension, Feedback, Inspiration, PDF lesson plan, Real life application, and Scientific principles.

[Image: Yardstick LMS Real life application component page]

Schools could opt for different modules for different grades. When a module was subscribed to by a school, a clone of the master module was created, and the school copy was visible only to that school. The school version could be modified by the school admin as per their needs and preferences; the master module remained the same. While creating a subscription, the administrator had to provide a date so that the components became accessible to the students. The school admin could set different dates for different components, and only components with a past date were accessible.

[Image: Flow diagram of module subscription to a school]

We also provided an option to create a dynamic feedback form for the modules for analysis. The Yardstick admin could design and create a feedback form as per their requirements and assign it to a particular module. Different types of elements could be used to design the form, such as rating, captcha, email, range slider, text field, checkboxes, radio buttons and so on. Students and teachers need to submit their feedback for each of the modules; on the basis of this, the Yardstick team tries to improve the content of the system.

Various roles were also defined for users: Yardstick Administrator, School Administrator, Teacher, and Student.

1. Yardstick Admin: The Yardstick admin can perform all operations. He or she can create new users, and grant and revoke permissions.

2. School Admin: Handles all operations related only to their school. The school admin manages the modules and their components, and can import users for their school. All school reports and task submissions are visible to school admins.

3. Teachers: Teachers can view modules and components assigned to their classes, provide remarks to students for multiple components, and view all kinds of reports.
4. Students: They can attempt quizzes, submit tasks, view components and view their own reports.

What's the future of e-learning?

According to a report on Research and Markets, the e-learning market is anticipated to generate revenue of $65.41 billion by 2023, with a growth rate of 7.07% during the forecast period. The report goes on to state that with the advent of cloud infrastructure, peer-to-peer problem solving and open content creation, more business opportunities will pop up for service providers in the global e-learning market. The introduction of cloud-based learning and AR/VR mobile-based learning will be a major factor driving the growth of e-learning. According to Technavio, the growth of the market is due to learning process enhancements in the academic sector.

[Chart: Global self-paced e-learning market 2019-2023 | Source: Technavio]

The following are major trends to look forward to:

- Microlearning, which emphasises the design of microlearning activities through micro-steps in digital media environments, will be on the rise.
- Gamification, the use of game thinking and game mechanics in a non-game context to keep users engrossed and help them solve more problems, will see increased adoption.
- Personalised learning, the tailoring of pedagogy, curriculum and learning environments to meet the demands of learners, can be a driving force.
- Automatic learning, like the scene in the movie The Matrix where a person is strapped into a high-tech chair and a series of martial arts training programs are downloaded into his brain, may one day be a possibility.

Conclusion

It's a world which is replete with possibilities. As one of the most intelligent species to walk this earth, we perpetually innovate in the way we want to lead a better lifestyle. We learn new things to gain more knowledge, and in the process we find ways of improving our learning experience. E-learning is one such tech marvel that promises to be a force to reckon with. It is not a disrupting technology, but something that is going to get bigger and bigger in the years to come. As a content management framework, Drupal offers a magnificent platform on which to build a robust e-learning system. With years of experience in Drupal development, OpenSense Labs can help in providing an amazing digital experience. Contact us at [email protected] to build an e-learning system using Drupal and transform the educational experience.
Posted over 5 years ago
Drupal and Composer - an In-Depth Look

As any developer working with Drupal 8 knows, working with Composer has become an integral part of working with Drupal. This can be daunting for those without previous experience working with the command line, and can still be a confusing experience for those who do. This is the first post in an explorative series of blog posts I will be writing on Composer, hopefully clearing up some of the confusion around it. The four blog posts on this topic will be as follows:

- Part 1: Understanding Composer
- Part 2: Managing a Drupal 8 site with Composer (Coming Soon)
- Part 3: Converting Management of an Existing Drupal 8 Site to Composer (Coming Soon)
- Part 4: Composer for Drupal Developers (Coming Soon)

So without further ado, let's get started.

Composer: What is it?

The Wikipedia page (https://en.wikipedia.org/wiki/Composer_(software)) describes Composer as follows:

Composer is an application-level package manager for the PHP programming language that provides a standard format for managing dependencies of PHP software and required libraries.

That's an accurate description, though a little wordy. So let's break it down a little further to understand what it means. Programmers like to use the term DRY - Don't Repeat Yourself. This means that whenever possible, code should be re-used rather than re-written. Traditionally this referred to code within the codebase of a single application, but with Composer, code can now be shared between applications as well. DRY is another way of saying don't re-invent the wheel; if someone else has already written code that does what you want to do, it's better to re-use that code than to write code that does the same thing.

For example, the current standard for authentication (aka logging in) to remote systems is the OAuth 2 protocol. This is a secure protocol that allows sites or applications to authenticate with other sites, such as Facebook, Google, Twitter, Instagram, and countless others. Writing OAuth 2 integrations is tricky, as the authentication process is somewhat complex. However, other developers have written code that handles OAuth 2 integration, and they have released this code on the internet in the form of a library. A library is basically a set of code that can be re-used by other sites. Using Composer, developers can include this library in a project and use it to authenticate to the remote API, saving the developer from having to write that code.

Composer allows developers to do the following:

- Download and include a library into a project, with a single command
- Download and include any libraries that library is dependent upon
- Check that system requirements are met before installing the library
- Ensure there are no version conflicts between libraries
- Update the library and its dependencies with a single command

So how does Composer work?

Composer itself is a piece of software. After a user has installed Composer, they can then say 'Composer: download Library A to my system'. Composer searches remote repositories for libraries. A repository is a server that provides a collection of libraries for download. When Composer finds Library A in a repository, it downloads the library, as well as any libraries that Library A is dependent upon.

A note on terminology

In this article, the term Library is used.
Libraries are also known as Packages, and are referred to as such on https://getcomposer.org/. A project is the codebase, generally for a website or application, that is being managed by Composer.

By default, the main repository Composer looks at is https://packagist.org/. This is a site that has been set up specifically for Composer, and it contains thousands of public libraries that developers have provided for use. When a user says 'Composer, download Library A', the Composer program looks for Library A on https://packagist.org/, the main public Composer repository, and if it finds the library, it downloads it to your system. If Library A depends upon (aka requires) Library B, then it will also download Library B to your system, and so on. It also checks that your system meets the minimum requirements to handle both Library A and Library B and any other dependencies, and checks whether either of these packages conflicts with any other libraries you've installed. If any conflicts are found, Composer shows an error and will not install the libraries until the conflicts have been resolved.

While packagist.org is the default repository Composer searches, projects can also define custom repositories that Composer will search for libraries. For example, many developers use Github or Bitbucket, popular services that provide code storage, to store their code in the cloud. A project owner can set up Composer to look for projects in their private Github, Bitbucket, or other repositories, and download libraries from these repositories. This allows both the public and private code of a project to be managed using Composer.

What happens when I install a library?

Composer manages projects on a technical level using two files: composer.json and composer.lock.

First we'll look at the composer.json file. This file describes the project. If a developer is using private repositories, the repositories are declared in this file. Any libraries that the project depends on are listed in this file. This file can also be used to set specific folder locations into which libraries should be installed, or to set up scripts that are executed as part of the Composer install process. It's the outline of the entire project.

Each library has a name made up of two parts. The first part is a namespace, an arbitrary string that can be anything but is often a company name or a Github user name; the second part is the library name. The two parts are separated by a forward slash and contain only lower-case letters. Drupal modules are all part of the drupal namespace.

Libraries are installed using Composer's require command. Drupal core, modules and themes can be installed with commands like:

// Drupal core
composer require drupal/core

// Drupal module
composer require drupal/rules

// Drupal theme
composer require drupal/bootstrap

When the above commands are run, Composer downloads the library and its dependencies, and adds the library to the composer.json file to indicate that your project uses the library. This means that composer.json is essentially a metadata file describing the codebase of your project, where to get that code, and how to assemble it.
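To make that concrete, here is a hedged sketch of what a minimal composer.json along those lines might look like; the project name example/my-drupal-project and the version constraint are placeholders rather than values from the original post:

```json
{
    "name": "example/my-drupal-project",
    "description": "Illustrative sketch of a Composer-managed Drupal project",
    "repositories": [
        {
            "type": "composer",
            "url": "https://packages.drupal.org/8"
        }
    ],
    "require": {
        "drupal/core": "^8.6"
    }
}
```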
Composer and GIT, Multiple Environments and Multiple Developers

Composer and GIT work really well with each other. To understand how, let's first look at traditional site management using GIT. Developer A is creating a new Drupal project, purely managed with GIT:

1. Developer A downloads Drupal core.
2. Developer A creates a new GIT repository for the code they have downloaded, and commits the code to the repository.
3. Developer A pushes the code to a central repository (often Github or Bitbucket).
4. Developer A sets up a production server and checks out (aka pulls) the code to that server.

This all sounds good, and it actually works very well. Now let's imagine that Developer B comes onto the project. Developer B uses GIT to download the code from the central repository. At this point, the codebase in GIT exists in four locations:

- Developer A's computer
- Developer B's computer
- The central repository
- The production server

At the moment, the codebase consists only of Drupal core. The Drupal core code is being managed through GIT, which allows changes to the code to be tracked, yet it's very unlikely that either Developer A or Developer B, or indeed any other developers that come onto the project, will ever edit any of these Drupal core files, as it is bad practice to edit Drupal core. Drupal core only needs to be tracked by developers who are developing Drupal core itself, not by projects that are simply using it. So the above setup results in sharing and tracking a bunch of code that is already shared and tracked somewhere else (on Drupal.org).

Let's look at how to start and use Composer to manage a project. Note that this is NOT the best way to use Composer to manage a Drupal site; it is simply an example to show how to use Composer (see part 2 of this series for specifics on how to use Composer to manage a Drupal site).

1. Developer A creates a new project folder and navigates into it.
2. Developer A initializes the project with composer init, which creates a composer.json file in the project folder.
3. Developer A adds the Drupal repository at https://packages.drupal.org/8 to composer.json, so that Drupal core, modules and themes can be installed using Composer.
4. Developer A runs composer require drupal/core, which installs Drupal core to the system, as well as any dependencies. It also creates composer.lock (which we'll look at further down the article).
5. Developer A creates a new GIT repository, and adds composer.json and composer.lock to the GIT repository.
6. Developer A pushes composer.json and composer.lock to the central repository.
7. Developer A sets up the production server, and checks out the code to this server. At this point, the code consists only of the composer.json and composer.lock files. Additional servers can be set up by checking out the code to any server.
8. Developer A runs composer install on the production server. This pulls all the requirements and dependencies for the project as they are recorded in composer.json and composer.lock.

Now when Developer B comes onto the project, Developer B uses GIT to download the codebase to their local computer. This codebase contains only composer.json and composer.lock. However, when they run composer install, they end up with the exact same codebase as the production server and Developer A's machine. Now the codebase exists in the same four locations, yet the only code being tracked in the GIT repository is the two files used to define the Composer-managed project. When an update is made to the project, it is handled by running composer update drupal/core, which updates both composer.json and composer.lock. These files are then updated in the GIT repository, as they are the files specific to our project.
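The same workflow expressed as terminal commands; this is a hedged sketch only, in which the project folder name, remote URL and commit message are illustrative placeholders rather than anything from the original post:

```sh
# Developer A: create and define the Composer-managed project
mkdir my-project && cd my-project
composer init                                                  # creates composer.json
composer config repositories.drupal composer https://packages.drupal.org/8
composer require drupal/core                                   # downloads core + dependencies, writes composer.lock

# Track only the two Composer files in GIT
git init
git add composer.json composer.lock
git commit -m "Define Composer-managed project"
git remote add origin git@example.com:my-project.git           # placeholder remote
git push -u origin master

# Developer B (or the production server): rebuild the identical codebase
git clone git@example.com:my-project.git && cd my-project
composer install
```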
The difference between the traditional GIT method and the above method using Composer is that Drupal core is now considered an external library, and is not taking up space unnecessarily in our project's GIT repository.

Project Versions

Projects can, and pretty much always do, have versions. Drupal 8 uses semantic versioning, meaning that it goes through versions 8.1, 8.2, 8.3 and so on. At the time of writing, the current version is 8.6.3. If a new security fix is released, it will be 8.6.4. In time, 8.7.0 will be released. Composer allows us to work with different versions of libraries. This is a good thing, however it opens up the risk of developers on a project working with different versions of a library, which in turn opens up the possibility of bugs. Fortunately, Composer is built to deal with versions, as we will look at next.

Tracking Project Versions

So how does Composer handle versions, allowing developers to ensure they are always using the same library versions? Welcome the composer.lock file. The composer.lock file essentially acts as a snapshot of all the versions of all the libraries managed by composer.json. Again, I'll refer back to the Composer-managed site described above. When we first run composer require drupal/core in our project, a few things happen:

- The current (most recent) version of Drupal is downloaded to the system
- All libraries that Drupal depends on are also downloaded to the system
- composer.json is updated to show that Drupal is now a dependency of your project
- composer.lock is created/updated to reflect the current versions of all Composer-managed libraries

So composer.json tracks which libraries are used, and composer.lock is a snapshot tracking which versions of those libraries are currently being used on the project.

Synchronizing Project Versions

The problem with developers using different versions of libraries is that a developer may write code that only works with the version of the library they have installed, which other developers either don't have yet, or have already moved past because they updated. Composer projects manage library versions using the commands composer install and composer update. These commands do different things, so next we'll look at the differences between them.

Composer Install and Composer Update

Imagine that Composer didn't track versions. The following situation would happen (again, this is NOT how it actually works):

1. Drupal 8.5.6 is released.
2. Developer A creates a new project, and sets Drupal core as a dependency in composer.json. Developer A has Drupal 8.5.6.
3. Drupal 8.6.0 is released.
4. Developer B clones the GIT project, and installs the codebase using composer install. Composer downloads Drupal core. Developer B has Drupal 8.6.0.

The two developers are now working on different versions of Drupal. This is dangerous, as any code they write or add may not be compatible with each other's code. Fortunately, Composer does track library versions. When a user runs composer install, the versions defined in composer.lock are installed. So when Developer B runs composer install, Drupal 8.5.6 is installed, even though Drupal 8.6.0 has been released, because 8.5.6 is listed in composer.lock as the version being used by the project. As such, developers working on Composer-managed projects should run composer install each time they pull updates from remote GIT repositories containing Composer-managed projects.
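A short terminal sketch of that synchronization step, using the illustrative version numbers from the scenario above:

```sh
# Developer B, after pulling the latest composer.json and composer.lock
git pull
composer install    # installs the exact versions recorded in composer.lock,
                    # e.g. drupal/core 8.5.6, even if 8.6.0 has since been released
```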
Updating versions

As has been discussed, the composer.lock file tracks the versions of libraries currently used on the project. This is where the composer update command comes in. Let's review how to manage version changes for a given library (this is how it actually works):

1. Drupal 8.5.6 is released.
2. Developer A creates a new project, and sets Drupal core as a dependency. The composer.lock file records the version of Drupal core used by the project as 8.5.6.
3. Drupal 8.6.0 is released.
4. Developer B clones the GIT project, and installs the codebase using composer install. The composer.lock file lists the version of Drupal core being used on the project as 8.5.6, so that version is downloaded.
5. Developer A sees that a new version of Drupal has been released. Developer A runs composer update drupal/core. Composer installs Drupal 8.6.0 on their system, and updates composer.lock to show the version of Drupal core in use as 8.6.0.
6. Developer A commits this updated composer.lock to GIT, and pushes it to the remote repository.
7. Developer B pulls the GIT repository, and gets the updated composer.lock file. Developer B then runs composer install, and since the version of Drupal core registered as being used is now 8.6.0, Composer updates the code to Drupal 8.6.0.

Now Developer A and Developer B both have the exact same versions of Drupal on their systems. And still, the only files managed by GIT at this point are composer.json and composer.lock.

Tying it all together

Developers should always run composer install any time they see that a commit has made changes to the composer.lock file, to ensure that they are on the same codebase as all other developers. Developers should also always run composer install any time they switch GIT branches, such as between a production and a staging branch. The dependencies of these branches may be very different, and running composer install will update all dependencies to match the current composer.lock snapshot. The composer update command should only be used to update to new versions of libraries, and the composer.lock file should always be committed after running composer update. Finally, any time a developer adds a new dependency to the project, they need to commit both the composer.json file and the composer.lock file to GIT.

Summary

Before moving on to the next blog post in this series, you should understand the following:

- What the composer.json file does
- What the composer.lock file does
- When to use composer install
- When to use composer update
- How GIT and Composer interact with each other

In the next post, coming soon, we'll look specifically at building and managing a Drupal project using Composer.
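As a compact recap of the update flow described in this post, here is a hedged command sequence; the commit message and version number are illustrative placeholders:

```sh
# Developer A: update one library and share the new snapshot
composer update drupal/core --with-dependencies   # updates the code and rewrites composer.lock
git add composer.json composer.lock
git commit -m "Update drupal/core to 8.6.0"
git push

# Developer B: pick up the change and synchronize
git pull
composer install                                  # brings the local codebase to the versions in composer.lock
```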
Posted over 5 years ago
We're featuring some of the people in the Drupalverse! This Q&A series highlights some of the individuals you could meet at DrupalCon. First up, Everett Zufelt.
Posted over 5 years ago
weKnow's remote working guide to success - Part 2

As a fully distributed company, weKnow supports remote working, a form of management and daily routine that may not be for everyone. But we have proven that all bumps in the road can be successfully sorted out, and our organization has even surpassed the productivity metrics of the in-office style. Having a career outside of a traditional office setting comes with unique challenges; getting to know them beforehand will allow you to be more productive and happier. Read further to learn some tips to help you and your team excel.

admin, Tue, 11/13/2018 - 23:09