

Covering OroCRM topics, including community updates and company announcements.

This topic contains 9 replies, has 3 voices, and was last updated by  atwix 2 years, 7 months ago.

  • Creator
    Topic
  • #6777

    Rodolfo
    Participant

    Hi there!

I’m trying to sync Oro with Magento but I’m getting some errors on the “orders” import.

This is my command line:

Yes, you read that right: 3GB of memory_limit in my PHP. This is my ‘app/logs/dev.log’:

I have more errors like this one, but I tried to run the last SQL query manually:

    I got this error:

I removed that index and I’m running the sync again right now. I’ll let you guys know the results.

    Thanks.

Viewing 9 replies - 1 through 9 (of 9 total)
  • Author
    Replies
  • #6798

    Rodolfo
    Participant

I tried disabling the Doctrine SQL logging in config.yml and also running the console command with ‘-e prod’, but PHP is still leaking memory.
Now I have this error in my batch file:

[2014-11-03 20:47:52] batch.WARNING: The Oro\Bundle\IntegrationBundle\ImportExport\Writer\PersistentBatchWriter was unable to handle the following item: [] (REASON: A new entity was found through the relationship 'Oro\Bundle\WorkflowBundle\Entity\WorkflowItem#currentStep' that was not configured to cascade persist operations for entity: Oro\Bundle\WorkflowBundle\Entity\WorkflowStep@0000000066ad402c00000000779326c1. To solve this issue: Either explicitly call EntityManager#persist() on this unknown entity or configure cascade persist this association in the mapping for example @ManyToOne(..,cascade={"persist"}). If you cannot find out which entity causes the problem implement 'Oro\Bundle\WorkflowBundle\Entity\WorkflowStep#__toString()' to get a clue.
    Following orders were not imported: 100000294, 100000295, 100000296, 100000297, 100000298, 100000299, 100000301, 100000302, 100000303, 100000304, 100000305, 100000306, 100000307, 100000309, 100000310, 100000311, 100000312, 100000313, 100000314, 100000315, 100000316, 100000317, 100000318, 100000319, 100000320) [] []

    #6862

    Rodolfo
    Participant

So, how can I stop the memory from increasing on each Doctrine exception?

• Should I force the garbage collector with gc_collect_cycles();?
• Should I call the EntityManager methods $this->em->flush(); and $this->em->clear();?
    #6874
    Yevhen Shyshkin
    Oro Core

    Hello, Rodolfo Bandeira.

I would recommend testing the import of a large number of entities in prod mode (--env=prod): it automatically disables the SQL logger and will not consume extra memory for caching.

PersistentBatchWriter already clears the EntityManager, so you shouldn’t need to worry about that.

We’ve tested the Magento integration with ~300k entities and it worked fine without significant memory leaks. Could you please check whether xdebug (and any other debug tools) are disabled?

As for the error “A new entity was found through the relationship…”: please check that you are using a stable release version (1.4 is the latest) and that your cache is valid. We faced this issue a long time ago and it has been fixed.

    #6875

    Rodolfo
    Participant

    Hi Yevhen Shyshkin,

Thank you for your answer! My xdebug is disabled. I’m using Enterprise Version 1.6. I tried a lot of things, but I think you’re right: on my first sync attempt I didn’t pass ‘-e prod’, so it ran in the dev environment.

My Magento store has some junk data. Some users, for example, don’t have a ‘Last Name’, and when OroCRM tries to import that data it generates a Doctrine exception saying the field can’t be null. All these warnings keep increasing PHP memory usage.

So I’ll try the import again on a different database using ‘-e prod’, and I’ll come back to report the results. Thanks!

    #6886

    Rodolfo
    Participant

    Hi Yevhen Shyshkin and all Oro Friends,

I’m running my system on an Amazon AWS m3.medium (which means 3.75 GiB of RAM).
PHP version: 5.5.9-1ubuntu4.5
OroCRM: Enterprise Edition 1.6.0-RC1
Magento ver. 1.11.2.0 (I removed that fix in the Oro Bridge ‘wsdl.xml’ to make it work with this version)
My php.ini memory_limit is 2048M
My xdebug is disabled

I dropped my database and created everything again. After I started the sync using ‘-e prod’ I got the following results:

nohup php /var/www/html/orocrm/app/console oro:cron:integration:sync --integration-id=1 -e prod &

All my users were imported successfully, with only 4 invalid entities.

When the “Cart” entity started to sync, PHP memory wasn’t freed and kept increasing. It imported only 16,000 carts and then I got the PHP memory error.


    Loading entity by id: 11832
    Loading entity by id: 11833
    PHP Fatal error: Allowed memory size of 2147483648 bytes exhausted (tried to allocate 72 bytes) in /var/www/html/orocrm/vendor/doctrine/orm/lib/Doctrine/ORM/UnitOfWork.php on line 587

    In my last batch file I have this:

SQLSTATE[23000]: Integrity constraint violation: 1062 Duplicate entry 'OroCRM\Bundle\MagentoBundle\Entity\Customer-503' for key 'idx_entity'
Following orders were not imported: 100000806, 100000807, 100000808, 100000809, 100000810, 100000811, 100000812, 100000813, 100000814, 100000815, 100000816, 100000817, 100000818, 100000819, 100000820, 100000821, 100000822, 100000825, 100000826, 100000827, 100000828, 100000829, 100000830, 100000831, 100000832) [] []
[2014-11-05 22:37:07] batch.WARNING: The Oro\Bundle\IntegrationBundle\ImportExport\Writer\PersistentBatchWriter was unable to handle the following item: [] (REASON: An exception occurred while executing 'INSERT INTO oro_search_item (entity, alias, record_id, title, changed, created_at, updated_at) VALUES (?, ?, ?, ?, ?, ?, ?)' with params ["OroCRM\\Bundle\\MagentoBundle\\Entity\\Customer", "orocrm_magento_customer", 517, "XXXXXX XXXXXXX", 0, "2014-11-06 03:37:07", "2014-11-06 03:37:07"]:

SQLSTATE[23000]: Integrity constraint violation: 1062 Duplicate entry 'OroCRM\Bundle\MagentoBundle\Entity\Customer-517' for key 'idx_entity'
Following orders were not imported: 100000833, 100000834, 100000835, 100000836, 100000837, 100000838, 100000839, 100000840, 100000841, 100000842, 100000843, 100000844, 100000845, 100000846, 100000847, 100000848, 100000849, 100000850, 100000851, 100000852, 100000853, 100000854, 100000855, 100000856, 100000857) [] []

I don’t know what to do anymore. I tried to force $this->em->clear() here and here, but neither worked.

    Thanks!

    #6888
    Yevhen Shyshkin
    Oro Core

    Hello again, Rodolfo Bandeira.

Thank you for the detailed report! It looks like this issue reproduces consistently in your environment, and according to your description all your actions are valid. We definitely need to investigate more deeply and check the two reported issues.

As for probable reasons: I think you are using the MySQL DBMS with the ORM search engine, and that engine uses a MyISAM table to store search data. MyISAM itself does not support transactions, so maybe that is the cause of the issue.

You mentioned you are using the Enterprise version. Could you please try replacing the ORM search engine with Elastic Search and run the same import again? If the errors do not reproduce, we can be sure the issue is in the search mechanism.

    #6980

    Rodolfo
    Participant

Hi guys. I fixed my specific problem by checking whether any of my orders have a product with name == null. I don’t think this warrants a pull request, but if you guys decide to take a look, here is the patch:

    #7000
    Yevhen Shyshkin
    Oro Core

    Hello, Rodolfo Bandeira.

Thank you for sharing your results and for the attached patch! I’ve created an improvement for that; we’ll probably fix it in one of our next releases.

    #7090

    atwix
    Participant

I have the same issue trying to import 170k orders with 2 GB of RAM. Xdebug is not installed and --env=prod is specified.
It would be great to know how to restart the import process from a specific entity. For example, if the script runs out of memory on entity (order) 71093, start the next execution from that entity.
Trying to figure it out.

