In the last article, we started migrating nodes. Today, we’ll continue with the three content types that still need to be imported. This is the 30th article in the series, so the migrations should feel familiar. To keep things interesting, in addition to explaining how to perform those node migrations, we’ll also discuss other topics, like how the high water property is set up, strategies for performing quality assurance, and the benefits of enabling entity validation in content migrations. Ready?
Alias configuration in high water property
We normally start each article with a note on how to avoid entity ID conflicts and how to set up a high water mark for the migrations that will be covered. Today, we continue with node migrations, so what we explained last time applies here, too. Instead, let's talk about something we have seen many times but never stopped to explain: the alias configuration used when setting up the high water property.
Consider the following snippet copied from our node migrations:
source:
  key: migrate
  plugin: tag1_d7_node
  node_type: page
  high_water_property:
    name: vid
    alias: nr
The high_water_property consists of an array of two values. As explained in other articles, name is set to a timestamp or serial field returned by the source plugin. In this case, vid is a reference to a base field definition of the node entity that represents the node's revision ID. But what is the purpose of the alias configuration, and where does the nr value come from?
It’s possible that the source plugin retrieves data from multiple tables. An alias is assigned to each table when assembling the query in the source plugin. Among other things, this allows for the disambiguation of fields in case the same field name exists in more than one table.
Consider the following snippet containing part of the query implementation of the d7_node source plugin:
/**
 * Drupal 7 node source from database.
 *
 * @see \Drupal\node\Plugin\migrate\source\d7\Node
 *
 * @MigrateSource(
 *   id = "d7_node",
 *   source_module = "node"
 * )
 */
class Node extends FieldableEntity {

  /**
   * The join options between the node and the node_revision table.
   */
  const JOIN = '[n].[vid] = [nr].[vid]';

  /**
   * {@inheritdoc}
   */
  public function query() {
    // Select node in its last revision.
    $query = $this->select('node_revision', 'nr')
      ->fields('n', [
        'nid',
        'type',
        'language',
        'status',
        'created',
        'changed',
        'comment',
        'promote',
        'sticky',
        'tnid',
        'translate',
      ])
      ->fields('nr', [
        'vid',
        'title',
        'log',
        'timestamp',
      ]);
    $query->addField('n', 'uid', 'node_uid');
    $query->addField('nr', 'uid', 'revision_uid');
    $query->innerJoin('node', 'n', static::JOIN);
    // More code.
    return $query;
  }

}
Refer to this documentation page to learn more about dynamic queries in Drupal. What is important to note is that the source plugin queries the node and node_revision tables from Drupal 7. Both tables have a vid field. In fact, they have many fields in common: nid, uid, title, status, comment, promote, sticky.
When the query is built, we need a way to indicate which table to fetch a field from, and that is what the table alias is used for. In the code above, the alias is the second argument to the select and innerJoin methods. Those table aliases are the ones we can use when setting up the high_water_property. You can see from the snippet above that vid is being retrieved from the node_revision table, whose alias is nr.
Technical note: It’s possible to retrieve a field common to multiple tables as long as you define field aliases to be able to tell them apart. In the snippet above, uid from the node table is retrieved as node_uid, while uid from the node_revision table is retrieved as revision_uid.
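Those field aliases become the names of the source properties available to the process pipeline. For reference, this is how the node migrations later in this article map them:

process:
  # uid from the node table, exposed by the source plugin as node_uid.
  uid:
    plugin: get
    source: node_uid
  # uid from the node_revision table, exposed as revision_uid.
  revision_uid:
    plugin: get
    source: revision_uid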
Migrating venue nodes
Now that we’ve clarified the alias for the high water property, let’s continue by migrating venues. We’re using upgrade_d7_node_venue, so copy it from the reference folder into our tag1_migration custom module and rebuild caches for the migration to be detected.
cd drupal10
cp ref_migrations/migrate_plus.migration.upgrade_d7_node_venue.yml web/modules/custom/tag1_migration/migrations/upgrade_d7_node_venue.yml
ddev drush cache:rebuild
If you do not have a migrate_plus.migration.upgrade_d7_node_venue.yml file in the ref_migrations folder, it’s likely that you have a file named migrate_plus.migration.upgrade_d7_node_complete_venue.yml instead. That would mean that you used the node complete approach, instead of the classic one, when performing the automated migration. Feel free to proceed with our instructions using that file. Ultimately, we are going to update the source plugin to use our custom one. The file name or plugin ID has no effect on what data is retrieved. That will depend on which source plugin is used and how it is configured.
Note that while copying the file, we also changed its name and placed it in a migrations folder inside our tag1_migration custom module. After copying the file, make the following changes:
- Remove the following keys: uuid, langcode, status, dependencies, field_plugin_method, cck_plugin_method, and migration_group.
- Add two migration tags: node and tag1_content.
- Add key: migrate under the source section.
- Change the source plugin configuration to use our custom tag1_d7_node plugin.
- Add the high_water_property property as demonstrated above.
- Update the mapping of the field_media_image media field in the process section as we did in the previous article.
- Update the migration dependencies so that upgrade_d7_media_image and upgrade_d7_user are listed as required dependencies.
We also need to account for changes in text formats. In article 22, we decided not to migrate Drupal 7 text formats and instead leverage those that Drupal 10 provides out of the box. In practice, this means that the filtered_html format used in Drupal 7 no longer exists in Drupal 10. The migration of any rich text field has to be updated to map filtered_html to a valid text format in the new site. Using the snippet below, we replace it with the restricted_html text format in the field_additional_information field:
process:
  field_additional_information:
    -
      plugin: sub_process
      source: field_additional_information
      process:
        value: value
        format:
          -
            plugin: static_map
            source: format
            map:
              filtered_html: restricted_html
            bypass: TRUE
Note: You can use the snippet process proposed for the Migrate Plus module to create reusable process pipelines for use in your migrations.
After the modifications, the upgrade_d7_node_venue.yml file should look like this:
id: upgrade_d7_node_venue
class: Drupal\migrate\Plugin\Migration
migration_tags:
  - 'Drupal 7'
  - Content
  - node
  - tag1_content
label: 'Nodes (Venue)'
source:
  key: migrate
  plugin: tag1_d7_node
  node_type: venue
  high_water_property:
    name: vid
    alias: nr
process:
  nid:
    -
      plugin: get
      source: tnid
  vid:
    -
      plugin: get
      source: vid
  langcode:
    -
      plugin: default_value
      source: language
      default_value: und
  title:
    -
      plugin: get
      source: title
  uid:
    -
      plugin: get
      source: node_uid
  status:
    -
      plugin: get
      source: status
  created:
    -
      plugin: get
      source: created
  changed:
    -
      plugin: get
      source: changed
  promote:
    -
      plugin: get
      source: promote
  sticky:
    -
      plugin: get
      source: sticky
  revision_uid:
    -
      plugin: get
      source: revision_uid
  revision_log:
    -
      plugin: get
      source: log
  revision_timestamp:
    -
      plugin: get
      source: timestamp
  comment_node_venue/0/status:
    -
      plugin: get
      source: comment
  field_address:
    -
      plugin: addressfield
      source: field_address
  field_media_image:
    -
      plugin: sub_process
      source: field_image
      process:
        target_id:
          -
            plugin: migration_lookup
            source: fid
            migration: upgrade_d7_media_image
            no_stub: true
  field_phone:
    -
      plugin: get
      source: field_phone
  field_additional_information:
    -
      plugin: sub_process
      source: field_additional_information
      process:
        value: value
        format:
          -
            plugin: static_map
            source: format
            map:
              filtered_html: restricted_html
            bypass: TRUE
destination:
  plugin: 'entity:node'
  default_bundle: venue
migration_dependencies:
  required:
    - upgrade_d7_media_image
    - upgrade_d7_user
  optional: { }
Now, rebuild caches for our changes to be detected and execute the migration. Run migrate:status to make sure we can connect to Drupal 7. Then, run migrate:import to perform the import operations.
ddev drush cache:rebuild
ddev drush migrate:status upgrade_d7_node_venue
ddev drush migrate:import upgrade_d7_node_venue
If things are properly configured, you should not get any errors. Go to https://migration-drupal10.ddev.site/admin/content?type=venue and look at the list of migrated venue nodes. More important, though, is what you do not see. Drupal 7's "Test venue" node with nid 21 should not exist in the migrated site. If it appears, make sure you update the source plugin to use tag1_d7_node, rebuild caches, roll back, and import again.
You should see the media reference field populated. If that is not the case, verify that you replaced the mapping of field_image as generated by the migration with the process pipeline for the field_media_image field as described in the previous article. Remember that our migration plan calls for media reference fields in place of image fields. The generated migration would not account for that automatically, so we need to make this change ourselves.
Another thing to check is that address information was migrated properly. In Drupal 7, there were multiple options to store address-related information. Two examples are the Address Field and Location modules. Our example Drupal 7 project uses the former. In Drupal 10, we use the Address module, which provides an automated upgrade path for the Address Field module. If your Drupal 7 project uses the Location module, you can look at the Location Migration module for reference. This article includes more information on migrating address fields in Drupal 10.
Migrating session nodes
We use upgrade_d7_node_session to migrate sessions. Copy it from the reference folder into our tag1_migration custom module and rebuild caches for the migration to be detected.
cd drupal10
cp ref_migrations/migrate_plus.migration.upgrade_d7_node_session.yml web/modules/custom/tag1_migration/migrations/upgrade_d7_node_session.yml
ddev drush cache:rebuild
If you do not have a migrate_plus.migration.upgrade_d7_node_session.yml file in the ref_migrations folder, it’s likely that you have a file named migrate_plus.migration.upgrade_d7_node_complete_session.yml instead. That would mean that you used the node complete approach, instead of the classic one, when performing the automated migration. Feel free to proceed with our instructions using that file. Ultimately, we are going to update the source plugin to use our custom one. The file name or plugin ID has no effect on what data is retrieved. That will depend on which source plugin is used and how it is configured.
Note that while copying the file, we also changed its name and placed it in a migrations folder inside our tag1_migration custom module. After copying the file, make the following changes:
- Remove the following keys: uuid, langcode, status, dependencies, field_plugin_method, cck_plugin_method, and migration_group.
- Add two migration tags: node and tag1_content.
- Add key: migrate under the source section.
- Change the source plugin configuration to use our custom tag1_d7_node plugin.
- Add the high_water_property property as demonstrated above.
- Apply the same treatment to account for text format changes in the field_description field as we did with the field_additional_information field in the venue migration.
- Update the migration dependencies so that upgrade_d7_file, upgrade_d7_media_remote_video, upgrade_d7_node_speaker_to_user, upgrade_d7_taxonomy_term, and upgrade_d7_user are listed as required dependencies.
Before running the migration, there are two changes we need to make to accommodate content model changes. The first is for the speakers field. In Drupal 7, speakers were nodes. Back in article 26, speakers were migrated as users. The name of the field in the session content type is the same between both versions, but they point to different types of entities. Update the process pipeline for field_speakers with the following code snippet:
process:
  field_speakers:
    -
      plugin: sub_process
      source: field_speakers
      process:
        target_id:
          -
            plugin: migration_lookup
            source: target_id
            migration: upgrade_d7_node_speaker_to_user
            no_stub: true
In the previous article, we explained how migration lookup operations work. You can also refer to article 16 and this presentation to better understand the database table structure for Drupal fields and how that is relevant to field mappings in migrations. In the snippet above, we are using the target_id sub-field of Drupal 7's field_speakers to perform a lookup operation against the upgrade_d7_node_speaker_to_user migration we created in article 26. This will return the uid of the migrated user and assign it to the target_id sub-field of Drupal 10's field_speakers, establishing the relationship to the user entity as expected in the new site.
The second change we need to make is related to the field that stores the video recording for the session. In Drupal 7, we used the YouTube field module. In Drupal 10, we decided to leverage media entities, which are supported out of the box. In article 28, we learned how to create remote video media entities in Drupal 10 out of Drupal 7 field data. The name of the field also changed from field_video_recording to field_media_remote_video. Replace the mapping of field_video_recording from the generated migration with the snippet below:
process:
  field_media_remote_video:
    -
      plugin: sub_process
      source: field_video_recording
      process:
        target_id:
          -
            plugin: migration_lookup
            source: video_id
            migration: upgrade_d7_media_remote_video
            no_stub: true
The upgrade_d7_media_remote_video migration uses the video_id sub-field of Drupal 7's YouTube field module as its identifier. So, we use that as the source for the lookup operation to retrieve the corresponding media ID and assign it to the target_id sub-field of the field_media_remote_video field in the new site.
After the modifications, the upgrade_d7_node_session.yml file should look like this:
id: upgrade_d7_node_session
class: Drupal\migrate\Plugin\Migration
migration_tags:
  - 'Drupal 7'
  - Content
  - node
  - tag1_content
label: 'Nodes (Session)'
source:
  key: migrate
  plugin: tag1_d7_node
  node_type: session
  high_water_property:
    name: vid
    alias: nr
process:
  nid:
    -
      plugin: get
      source: tnid
  vid:
    -
      plugin: get
      source: vid
  langcode:
    -
      plugin: default_value
      source: language
      default_value: und
  title:
    -
      plugin: get
      source: title
  uid:
    -
      plugin: get
      source: node_uid
  status:
    -
      plugin: get
      source: status
  created:
    -
      plugin: get
      source: created
  changed:
    -
      plugin: get
      source: changed
  promote:
    -
      plugin: get
      source: promote
  sticky:
    -
      plugin: get
      source: sticky
  revision_uid:
    -
      plugin: get
      source: revision_uid
  revision_log:
    -
      plugin: get
      source: log
  revision_timestamp:
    -
      plugin: get
      source: timestamp
  comment_node_session/0/status:
    -
      plugin: get
      source: comment
  field_speakers:
    -
      plugin: sub_process
      source: field_speakers
      process:
        target_id:
          -
            plugin: migration_lookup
            source: target_id
            migration: upgrade_d7_node_speaker_to_user
            no_stub: true
  field_description:
    -
      plugin: sub_process
      source: field_description
      process:
        value: value
        format:
          -
            plugin: static_map
            source: format
            map:
              filtered_html: restricted_html
            bypass: TRUE
  field_media_remote_video:
    -
      plugin: sub_process
      source: field_video_recording
      process:
        target_id:
          -
            plugin: migration_lookup
            source: video_id
            migration: upgrade_d7_media_remote_video
            no_stub: true
  field_slides:
    -
      plugin: sub_process
      source: field_slides
      process:
        target_id: fid
        display: display
        description: description
  field_topics:
    -
      plugin: sub_process
      source: field_topics
      process:
        target_id: tid
destination:
  plugin: 'entity:node'
  default_bundle: session
migration_dependencies:
  required:
    - upgrade_d7_file
    - upgrade_d7_media_remote_video
    - upgrade_d7_node_speaker_to_user
    - upgrade_d7_taxonomy_term
    - upgrade_d7_user
  optional: { }
Now, rebuild caches for our changes to be detected and execute the migration. Run migrate:status to make sure we can connect to Drupal 7. Then, run migrate:import to perform the import operations.
ddev drush cache:rebuild
ddev drush migrate:status upgrade_d7_node_session
ddev drush migrate:import upgrade_d7_node_session
If things are properly configured, you should not get any errors and all fields should be populated. Go to https://migration-drupal10.ddev.site/admin/content?type=session and look at the list of migrated session nodes. More important, though, is what you do not see. Drupal 7's "Test session" node with nid 46 should not exist in the migrated site. If it appears, make sure you update the source plugin to use tag1_d7_node, rebuild caches, roll back, and import again.
Migrating event nodes
Next, we use upgrade_d7_node_event to migrate events. Copy it from the reference folder into our tag1_migration custom module and rebuild caches for the migration to be detected.
cd drupal10
cp ref_migrations/migrate_plus.migration.upgrade_d7_node_event.yml web/modules/custom/tag1_migration/migrations/upgrade_d7_node_event.yml
ddev drush cache:rebuild
If you do not have a migrate_plus.migration.upgrade_d7_node_event.yml file in the ref_migrations folder, it’s likely that you have a file named migrate_plus.migration.upgrade_d7_node_complete_event.yml instead. That would mean that you used the node complete approach, instead of the classic one, when performing the automated migration. Feel free to proceed with our instructions using that file. Ultimately, we are going to update the source plugin to use our custom one. The file name or plugin ID has no effect on what data is retrieved. That will depend on which source plugin is used and how it is configured.
Note that while copying the file, we also changed its name and placed it in a migrations folder inside our tag1_migration custom module. After copying the file, make the following changes:
- Remove the following keys: uuid, langcode, status, dependencies, field_plugin_method, cck_plugin_method, and migration_group.
- Add two migration tags: node and tag1_content.
- Add key: migrate under the source section.
- Change the source plugin configuration to use our custom tag1_d7_node plugin.
- Add the high_water_property property as demonstrated above.
- Apply the same treatment to account for text format changes in the field_description field as we did with the field_additional_information field in the venue migration.
- Update the migration dependencies so that upgrade_d7_node_session, upgrade_d7_node_sponsor_to_taxonomy_term, upgrade_d7_node_venue, and upgrade_d7_user are listed as required dependencies.
Before running the migration, we need to accommodate another content model change. In Drupal 7, sponsors were nodes but, remember, sponsors were migrated as taxonomy terms in article 26. The name of the field in the event content type is the same between both versions, but they point to different types of entities. Update the process pipeline for field_sponsors with the following code snippet:
process:
  field_sponsors:
    -
      plugin: sub_process
      source: field_sponsors
      process:
        target_id:
          -
            plugin: migration_lookup
            source: target_id
            migration: upgrade_d7_node_sponsor_to_taxonomy_term
            no_stub: true
After the modifications, the upgrade_d7_node_event.yml file should look like this:
id: upgrade_d7_node_event
class: Drupal\migrate\Plugin\Migration
migration_tags:
  - 'Drupal 7'
  - Content
  - node
  - tag1_content
label: 'Nodes (Event)'
source:
  key: migrate
  plugin: tag1_d7_node
  node_type: event
  high_water_property:
    name: vid
    alias: nr
process:
  nid:
    -
      plugin: get
      source: tnid
  vid:
    -
      plugin: get
      source: vid
  langcode:
    -
      plugin: default_value
      source: language
      default_value: und
  title:
    -
      plugin: get
      source: title
  uid:
    -
      plugin: get
      source: node_uid
  status:
    -
      plugin: get
      source: status
  created:
    -
      plugin: get
      source: created
  changed:
    -
      plugin: get
      source: changed
  promote:
    -
      plugin: get
      source: promote
  sticky:
    -
      plugin: get
      source: sticky
  revision_uid:
    -
      plugin: get
      source: revision_uid
  revision_log:
    -
      plugin: get
      source: log
  revision_timestamp:
    -
      plugin: get
      source: timestamp
  comment_node_event/0/status:
    -
      plugin: get
      source: comment
  field_description:
    -
      plugin: sub_process
      source: field_description
      process:
        value: value
        format:
          -
            plugin: static_map
            source: format
            map:
              filtered_html: restricted_html
            bypass: TRUE
  field_sessions:
    -
      plugin: get
      source: field_sessions
  field_venue:
    -
      plugin: get
      source: field_venue
  field_type:
    -
      plugin: get
      source: field_type
  field_sponsors:
    -
      plugin: sub_process
      source: field_sponsors
      process:
        target_id:
          -
            plugin: migration_lookup
            source: target_id
            migration: upgrade_d7_node_sponsor_to_taxonomy_term
            no_stub: true
  field_date:
    -
      plugin: sub_process
      source: field_date
      process:
        value:
          plugin: format_date
          from_format: 'Y-m-d H:i:s'
          to_format: Y-m-d
          source: value
destination:
  plugin: 'entity:node'
  default_bundle: event
migration_dependencies:
  required:
    - upgrade_d7_node_session
    - upgrade_d7_node_sponsor_to_taxonomy_term
    - upgrade_d7_node_venue
    - upgrade_d7_user
  optional: { }
Technical note: You can improve the process pipelines for the field_sessions and field_venue fields by performing lookup operations against the upgrade_d7_node_session and upgrade_d7_node_venue migrations, respectively. This will make sure that only nodes that were successfully imported into Drupal 10 are referenced. Revisit the previous article for a detailed explanation of how migration lookup operations work.
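As an illustration, here is a minimal sketch of what those improved pipelines could look like. It assumes the Drupal 7 field tables for field_sessions and field_venue expose the referenced node through a target_id sub-field, like the other entity reference fields in this article; confirm the actual column name in your source database before adopting it:

process:
  field_sessions:
    -
      plugin: sub_process
      source: field_sessions
      process:
        target_id:
          -
            # Only reference sessions that were successfully migrated.
            plugin: migration_lookup
            source: target_id
            migration: upgrade_d7_node_session
            no_stub: true
  field_venue:
    -
      plugin: sub_process
      source: field_venue
      process:
        target_id:
          -
            # Only reference venues that were successfully migrated.
            plugin: migration_lookup
            source: target_id
            migration: upgrade_d7_node_venue
            no_stub: true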
Now, rebuild caches for our changes to be detected and execute the migration. Run migrate:status to make sure we can connect to Drupal 7. Then, run migrate:import to perform the import operations.
ddev drush cache:rebuild
ddev drush migrate:status upgrade_d7_node_event
ddev drush migrate:import upgrade_d7_node_event
If things are properly configured, you should not get any errors and all fields should be populated. Go to https://migration-drupal10.ddev.site/admin/content?type=event and look at the list of migrated event nodes. More important, though, is what you do not see. Drupal 7's "Test event" node with nid 73 should not exist in the migrated site. If it appears, make sure you update the source plugin to use tag1_d7_node, rebuild caches, roll back, and import again.
Before wrapping up this section, I would like to point out that the event content type includes a date field. The automated migration created a process pipeline that imports its data. Refer to this article to learn more about migrating dates in Drupal 10.
Speaking of dates, take a look at nodes with nid 62 and 63 in the new site. They have no date set even though it is a required field. Is it an error in the migration? Let's use this opportunity to discuss quality assurance and entity validation in Drupal migrations.
Quality assurance in Drupal migrations
Like other aspects of a Drupal project, the result of the migration needs to be thoroughly tested for quality assurance. Back in article 18, we saw an example where working on new migrations uncovered the need to make manual adjustments to previously migrated configuration. While writing today's migrations, I noticed that I had not accounted for text format changes in the migration of basic page and article nodes. Upon realizing the omission, I went back and updated the previous entry in the series.
Going back to the missing dates in the migrated event nodes, all nodes except those with nid 62 and 63 have a value for the date field. Any guesses about what the issue with those two nodes could be? Where would you start debugging? Maybe the generated process pipeline is wrong after all?
What if I told you there is nothing wrong with our migration? To prove it, review the same nodes in Drupal 7 and you will see that our source site also has no date set for those nodes, even though the field is required. How that happened is not as relevant as the fact that it’s impossible to migrate data that never existed. This may seem like a far-fetched example, but I have seen it happen in real-life projects. That is, people asking why a field for a specific node or user is not set. It’s because the field for that particular node or user has no value.
There are multiple ways to assess the correctness of your migrations. Below is a non-exhaustive list of tools and techniques I have used in the past:
- Revisit the Drupal 7 site audit template. Introduced in article 3 and further explored in article 8, populating the template is one of the very first things I do in every migration project. It can be used to keep track of decisions about what elements to migrate, what to skip, and what will undergo content model changes. One time the template was extremely useful was when upgrading an application that used over 100 fields in a single content type. I was able to use the field instance report in the template to filter the list of fields per type. That helped me keep track of the fields that needed to be migrated and make sure I had accounted for them all.
- Use the Migrate QA module. It keeps track of every single migrated content entity, allows them to be reviewed by multiple stakeholders, and flags them depending on their approval status. Refer to the module's documentation for more information on how to implement it.
- Identify content with data in the fields to validate. For this, you can create a view and include filter criteria that ensure the fields under testing have a value. If done in Drupal 10, remember that views are configuration entities. If used exclusively for testing the migration, it’s very likely that you do not want to preserve the view. Make sure to delete it when you are done testing so it does not get exported with the rest of your site's configuration. Alternatively, you can query the Drupal 7 and Drupal 10 databases directly. This requires some understanding of Drupal’s database structure, but it’s a very efficient way to find entities that exactly match your test criteria.
- Create test content in the source site. Usually done in a local copy of the site, you can create test content where all the fields you are interested in testing are populated. If you happen to have conditional fields, or business logic that affects how multiple fields behave, creating sample content will save you time. This approach is useful to narrow down the amount of imported content that needs to be reviewed after executing the migrations.
No matter which approach you take, make sure to involve the client in the quality assurance process. They will have a deeper understanding of the content model and any business logic that might be in place. Among other things, you can ask the client to create a list of the most important pages or sections of the website or application. The development team can then pay extra attention to the migrations involved in producing the content on that list.
Throughout all your testing efforts, make sure to strike a balance. At some point, there will be diminishing returns. Consider time and budget when deciding how much and for how long you should test the migrations.
Content entity validation
There is one thing you can do to dramatically improve the quality of the migrated content: enable entity validation. All the destination plugins we have used so far in the series are subclasses of the EntityContentBase destination plugin, which exposes the validate configuration option. When set to TRUE, the Migrate API will call the validate method of the Entity API before importing a record. In practice, this means all validation constraints are enforced at the entity and field level.
The advantage of this is that you are guaranteed to have content that respects the validation put in place in the new site. The disadvantage is that the checks are very strict and content that does not comply is not migrated at all. Consider the following examples of migrations with entity validation on:
- Required fields: Drupal will refuse to import content if a single required field is missing (or it has an invalid value).
- Text formats: Drupal will check if the author of the node has permission to use the text format specified in rich text fields.
- Relationships among migrations: Drupal will fail to import a media entity that points to a file whose status is not permanent. This example is interesting because migrating non-permanent files (those with a status value of 0) does not produce a validation error in the file migration. But attempting to migrate media entities that point to non-permanent files will fail validation.
In all these cases, the whole content entity (node, user, taxonomy term, media, etc.) is skipped. Having entity validation enabled is useful to detect errors, especially early in the project. But it often becomes impractical because of issues in the source data that cannot be fixed. In our example, setting a date for the two nodes would be quick and easy. In real-life projects, it might be impractical to retroactively populate fields for content created over the years. In some cases, it would be outright impossible. Let's say a new required field was introduced to a content type years after it was initially created. Old content might not have a suitable value to use for the new field.
On this topic, I fondly remember migrating a Drupal 6 project with very little field validation in place. This was a humbling learning experience because things I took for granted from Drupal 7 did not exist in Drupal 6. The muscle memory I had developed around the structure of the database no longer applied. Content types and CCK fields had a different table structure back then. Moreover, a lot of content was entered in free-form text fields. We had to parse unstructured data and perform complex data transformations to make it fit the rigid content structure defined in the new site.
Before closing, I want to clarify that by no means do I intend to paint a bad picture of entity validation. I strive to use it in all my migration projects. I turn it off only when it’s impossible or impractical to keep it enabled. Even in those cases, it has already helped identify errors in my migrations or inconsistencies in the source data. When a content entity fails to import due to validation errors, a detailed report is logged to the migration message table, which can be retrieved using the drush migrate:messages command.
Are you up for a challenge? Enable entity validation for all the content migrations in the series and fix any issues that might arise. For the node migrations, you would configure the destination plugin like this:
destination:
  plugin: 'entity:node'
  default_bundle: DRUPAL_10_CONTENT_TYPE_MACHINE_NAME
  validate: true
Today, we wrote our 25th migration. Congrats on making it this far! Your perseverance is worthy of admiration. There is only one more article to go. Ready for the season finale?
Image by Entre_Humos from Pixabay