Adobe AD0-E716 Exam
Adobe Commerce Developer with Cloud Add-on (Page 2)

Updated On: 30-Jan-2026

An Adobe Commerce developer has added an iframe and included a JavaScript library from an external domain on the website. Afterward, they find the following error in the console:
Refused to frame [URL] because it violates the Content Security Policy directive. Which policy ids should be added to the csp_whitelist.xml file to fix this error?

  1. frame-src and script-src
  2. default-src and object-src
  3. frame-ancestors and connect-src

Answer(s): A

Explanation:

The frame-src directive controls which external URLs the current page may embed in an <iframe>, and script-src controls which URLs it may load scripts from. The console error "Refused to frame [URL]" means the embedding page's CSP does not permit the iframe's origin, which is a frame-src violation; the JavaScript library included from the external domain likewise requires a script-src entry. By contrast, frame-ancestors restricts which sites may embed the current page (the opposite direction), and connect-src governs XMLHttpRequest/fetch connections, not <script> tags, so neither fixes this error. To fix it, the developer whitelists the external domain under the frame-src and script-src policy ids in a module's csp_whitelist.xml file.
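For reference, entries in a module's etc/csp_whitelist.xml declare each policy by its id and list the allowed hosts. A sketch allowing an external iframe and script source (the example.com host and value ids are placeholders):

```xml
<?xml version="1.0"?>
<!-- etc/csp_whitelist.xml — host values below are placeholders -->
<csp_whitelist xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xsi:noNamespaceSchemaLocation="urn:magento:module:Magento_Csp:etc/csp_whitelist.xsd">
    <policies>
        <policy id="frame-src">
            <values>
                <value id="external-frame" type="host">https://www.example.com</value>
            </values>
        </policy>
        <policy id="script-src">
            <values>
                <value id="external-script" type="host">https://www.example.com</value>
            </values>
        </policy>
    </policies>
</csp_whitelist>
```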



An Adobe Commerce developer wants to generate a list of products using ProductRepositoryInterface and search for products using a supplier_id filter for data that is stored in a standalone table (i.e., not in an EAV attribute).
Keeping maintainability in mind, how can the developer add the supplier ID to the search?

  1. Write a before plugin on \Magento\Catalog\Model\ProductRepository::getList() and register the search criteria passed. Write an event observer to listen for the event catalog_product_collection_load_before. Iterate through the registered search criteria, and if found, apply the needed join and filter to the event's $collection.
  2. Add a custom filter to the virtual type Magento\Catalog\Model\Api\SearchCriteria\CollectionProcessor\ProductFilterProcessor for the supplier_id field. In the custom filter, apply the needed join and filter to the passed $collection.
  3. Write a before plugin on \Magento\Framework\Api\SearchCriteria\CollectionProcessorInterface::process(). Iterate through the $searchCriteria provided for supplier_id, and if found, apply the needed join and filter to the passed $collection.

Answer(s): B

Explanation:

The developer can add a custom filter to the virtual type Magento\Catalog\Model\Api\SearchCriteria\CollectionProcessor\ProductFilterProcessor for the supplier_id field. In the custom filter, the developer applies the needed join and filter to the passed $collection. This is the recommended, maintainable way to extend product search criteria, because it relies on dependency injection configuration rather than plugins or observers on framework internals.
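A sketch of such a custom filter, assuming a hypothetical Vendor\Module namespace and a standalone table named vendor_supplier_product (the vendor namespace, table, and column names are illustrative, not part of Adobe Commerce):

```php
<?php
// Vendor/Module/Model/Api/SearchCriteria/SupplierFilter.php
// Hypothetical sketch: class, table, and column names are illustrative.
namespace Vendor\Module\Model\Api\SearchCriteria;

use Magento\Framework\Api\Filter;
use Magento\Framework\Api\SearchCriteria\CollectionProcessor\FilterProcessor\CustomFilterInterface;
use Magento\Framework\Data\Collection\AbstractDb;

class SupplierFilter implements CustomFilterInterface
{
    /**
     * Join the standalone supplier table and filter by supplier_id.
     *
     * @return bool true signals the filter was fully applied
     */
    public function apply(Filter $filter, AbstractDb $collection)
    {
        $collection->getSelect()->join(
            ['supplier' => $collection->getTable('vendor_supplier_product')],
            'supplier.product_id = e.entity_id',
            []
        )->where('supplier.supplier_id = ?', $filter->getValue());

        return true;
    }
}
```

The filter is then registered on the ProductFilterProcessor virtual type via di.xml by mapping the supplier_id field name to this class in the processor's custom filters argument.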


Reference:

[Magento 2.4 DevDocs] [Magento Stack Exchange]



When researching indexer issues, an Adobe Commerce developer sees errors in the logs similar to: Memory size allocated for the temporary table is more than 20% of innodb_buffer_pool_size. It is suggested that the client update innodb_buffer_pool_size or decrease the batch size value.
Why does decreasing the batch size value improve performance?

  1. This decreases memory usage for the temporary table.
  2. This allows for a longer timeout per batch process.
  3. This allows for more PHP threads to be utilized during the process.

Answer(s): A

Explanation:

Decreasing the batch size value improves performance by reducing the memory used by the temporary table. The batch size determines how many rows the indexer processes at a time. A large batch size can cause the memory allocated for the temporary table to exceed 20% of innodb_buffer_pool_size, producing the logged error and slowing the indexing process. Lowering the batch size keeps each batch's temporary table within the buffer pool's limits, so the indexer can process the data efficiently and avoid memory issues.
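Batch sizes for several indexers can be tuned through di.xml arguments; a sketch following the batching pattern documented for Magento's category-product indexer (the value shown is illustrative, and the argument name should be verified against the installed version):

```xml
<!-- di.xml — lowering batchRowsCount reduces the rows processed per batch,
     and with it the memory footprint of the indexer's temporary table -->
<type name="Magento\Catalog\Model\Indexer\Category\Product\Action\Full">
    <arguments>
        <argument name="batchRowsCount" xsi:type="number">100000</argument>
    </arguments>
</type>
```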





An international merchant complains that changes are taking too long to be reflected on the frontend after a full product import.
Suspecting database issues, the Adobe Commerce developer collects the following entity counts:
· Categories: 900
· Products: 300k
· Customers: 700k
· Customer groups: 106
· Orders: 1600k
· Invoices: 500k
· Creditmemos: 50k
· Websites: 15
· Stores: 45
What is a probable cause for this?

  1. The combination of the number of products, categories and stores is too big. This leads to a huge amount of values being stored in the flat catalog indexes which are too large to be processed at a normal speed.
  2. The combination of the number of orders, customers, invoices and creditmemos is too big. This leads to a huge amount of values being stored in the customer grid index which is too large to be processed at a normal speed.
  3. The combination of the number of products, customer groups and websites is too big. This leads to a huge amount of values being stored in the price index which is too large to be processed at a normal speed.

Answer(s): C

Explanation:

The probable cause for the delay is the combination of the number of products, customer groups, and websites. The price index calculates the final price of each product for each customer group and website, taking into account factors such as tax, discounts, and catalog price rules. With roughly 300,000 products, 106 customer groups, and 15 websites, the index can hold up to 300,000 × 106 × 15 ≈ 477 million price rows, so a full reindex after a product import is too large to be processed at normal speed.





When checking the cron logs, an Adobe Commerce developer sees that the following job occurs daily: main.INFO: Cron Job inventory_cleanup_reservations is successfully finished. However, the inventory_reservation table in the database is not emptied.
Why are there records remaining in the inventory_reservation table?

  1. Only reservations matching canceled orders are removed by the cron job.
  2. Only reservations no longer needed are removed by the cron job.
  3. The "Auto Cleanup" feature from Multi Source Inventory was disabled in configuration.

Answer(s): B

Explanation:

Records remain in the inventory_reservation table because the cron job removes only reservations that are no longer needed. The table records a reservation entry per product as an order moves through its lifecycle: placing an order appends an entry with a negative quantity, and shipping, cancelling, or refunding appends compensating entries with positive quantities.
When an order is complete, the sum of its reservations for each product is zero. The cron job removes only those reservation groups whose sum is zero, leaving behind entries still needed for incomplete orders.
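The leftover rows can be inspected by summing reservation quantities; a sketch against the default inventory_reservation schema (column names follow the default schema and should be verified on the installation at hand):

```sql
-- Groups that sum to zero belong to completed orders and are removed by the
-- cleanup cron; non-zero sums belong to still-open orders and are kept.
SELECT sku, stock_id, SUM(quantity) AS remaining
FROM inventory_reservation
GROUP BY sku, stock_id;
```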




