cubicweb #3685463 Behavior is different in migration script and in shell [validation pending]

On an apycot migration, a query fails while running the migration script. If the script is stopped just before the query and the same query is run inside a cubicweb shell, it succeeds. This seems linked to the workflow creation.
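The TypeResolverException suggests the migration connection sees a schema snapshot in which no relation definition allows `TestExecution in_state State`, while a fresh shell session sees one. A toy model of that difference (purely illustrative, NOT cubicweb's actual resolver; all names below are hypothetical):

```python
# Toy illustration of RQL type resolution for "Any X WHERE X is
# TestExecution, NOT EXISTS(X in_state S)": resolution fails when the
# schema snapshot has no (subject, object) pair for the relation.

class TypeResolverException(Exception):
    pass

def solve(var_type, rtype, rdefs):
    """Return var_type if some relation definition of `rtype` accepts it
    as subject; otherwise raise, mimicking rql's TypeResolverException."""
    subjects = {subj for subj, _obj in rdefs.get(rtype, set())}
    if var_type not in subjects:
        raise TypeResolverException(
            'Unable to resolve variables types in '
            '"Any X WHERE X is %s, NOT EXISTS(X %s S)"' % (var_type, rtype))
    return var_type

# Stale snapshot, as the migration script might see it: the workflow
# relation was added earlier but derived definitions were not rebuilt.
stale = {'in_state': {('Plan', 'State')}}
# Fresh snapshot, as loaded by a new cubicweb-ctl shell session.
fresh = {'in_state': {('Plan', 'State'), ('TestExecution', 'State')}}
```

Under this reading, the shell succeeds only because it loads the schema after the workflow relations were added, which is consistent with the fix that eventually closed this ticket (rebuilding inferred relations during migration).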

Traceback (run of the migration script followed by the same query in the shell):

Upgrading the instance ptonelli-jpl
-----------------------------------
/usr/lib/python2.7/dist-packages/cubicweb/rset.py:547: Warning: [yams 0.38] no targettype specified and there are several relation definitions for rtype cw_schema: ['CWRelation', 'CWEType', 'CWRType']. Yet you get the first rdef.
  rdef = eschema.rdef(attr, role)
-> migration needed from 0.39.1 to 0.40.0 for jpl
-> migration needed from 1.10.0 to 1.10.1 for forge
-> migration needed from 2.2.7 to 3.0.0 for apycot
-> migration needed from 0.4.0 to 0.4.1 for mercurial_server
-> migration needed from 0.4.1 to 0.4.2 for testcard
-> migration needed from 3.2.1 to 4.0.0 for narval
-> migration needed from 1.14.0 to 1.15.2 for tracker
-> migration needed from 1.9.1 to 1.10.0 for comment
-> migration needed from 1.15.0 to 1.16.0 for file
-> migration needed from 1.7.0 to 1.7.1 for link
-> migration needed from 1.7.1 to 1.7.2 for mailinglist
-> migration needed from 1.0.0 to 1.1.0 for preview
-> migration needed from 1.8.0 to 1.8.1 for tag
-> migration needed from 1.8.0 to 1.8.1 for task
-> migration needed from 3.17.10 to 3.18.3 for cubicweb
/usr/lib/python2.7/dist-packages/cubicweb/schema.py:717: Warning: [yams 0.38] no targettype specified and there are several relation definitions for rtype defaultval: ['String', 'Bytes']. Yet you get the first rdef.
  if self.rdef(rschema).get('fulltextindexed'):
dropping index unique_cw_cwsourcehostconfig_cw_cw_host_config_of_cw_match_host
dropping index unique_cw_insertionpoint_cw_lid_cw_point_of_idx
dropping index unique_cw_mercurialserverpermission_cw_access_key_cw_granted_on
dropping index unique_cw_revision_cw_changeset_cw_from_repository_idx
dropping index unique_cw_revision_cw_from_repository_cw_revision_idx
dropping index unique_cw_trinfo_cw_tr_count_cw_wf_info_for_idx
dropping index unique_cw_version_cw_num_cw_version_of_idx
dropping index unique_cw_versionedfile_cw_directory_cw_from_repository_cw_name
WARNING: attribute for_each missing from fs schema
WARNING: attribute from_context missing from fs schema
WARNING: attribute execution_status missing from fs schema
WARNING: attribute optional missing from fs schema
WARNING: attribute use missing from fs schema
WARNING: attribute to_context missing from fs schema
WARNING: attribute outdates missing from fs schema
WARNING: attribute target missing from fs schema
WARNING: attribute arguments missing from fs schema
-> installing /usr/lib/python2.7/dist-packages/cubicweb/schemas/_regproc.postgres.sql
warning: attribute Task cost is not known, skip deletion
warning: attribute Task author is not known, skip deletion
warning: relation type todo_by is already known, skip addition
warning: relation Task todo_by CWUser is already known, skip addition
-> installing /usr/share/cubicweb/cubes/narval/schema/_regproc.postgres.sql
warning: relation TestExecution execution_log File is already known, skip addition
warning: relation type execution_archive is already known, skip addition
warning: relation TestExecution execution_archive File is already known, skip addition
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/cubicweb/cwctl.py", line 162, in run_arg
    status = cmdmeth(appid)
  File "/usr/lib/python2.7/dist-packages/cubicweb/cwctl.py", line 784, in upgrade_instance
    mih.migrate(vcconf, reversed(toupgrade), self.config)
  File "/usr/lib/python2.7/dist-packages/cubicweb/server/migractions.py", line 152, in migrate
    super(ServerMigrationHelper, self).migrate(vcconf, toupgrade, options)
  File "/usr/lib/python2.7/dist-packages/cubicweb/migration.py", line 181, in migrate
    self.cmd_process_script(script)
  File "/usr/lib/python2.7/dist-packages/cubicweb/server/migractions.py", line 157, in cmd_process_script
    migrscript, funcname, *args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/cubicweb/migration.py", line 350, in cmd_process_script
    execfile(migrscript, scriptlocals)
  File "/usr/share/cubicweb/cubes/apycot/migration/3.0.0_Any.py", line 33, in <module>
    te_eids = rql('Any X WHERE '
  File "/usr/lib/python2.7/dist-packages/cubicweb/server/migractions.py", line 1479, in rqlexec
    res = execute(rql, kwargs, build_descr=build_descr)
  File "/usr/lib/python2.7/dist-packages/cubicweb/server/session.py", line 1183, in execute
    rset = self._execute(self, rql, kwargs, build_descr)
  File "/usr/lib/python2.7/dist-packages/cubicweb/server/querier.py", line 603, in execute
    eidkeys = self.solutions(session, rqlst, args)
  File "/usr/lib/python2.7/dist-packages/cubicweb/cwvreg.py", line 705, in solutions
    return self.rqlhelper.compute_solutions(rqlst, {'eid': type_from_eid}, args)
  File "/usr/lib/pymodules/python2.7/rql/__init__.py", line 107, in compute_solutions
    debug)
  File "/usr/lib/pymodules/python2.7/rql/analyze.py", line 399, in visit
    self._visit(node)
  File "/usr/lib/pymodules/python2.7/rql/analyze.py", line 353, in _visit
    func(node)
  File "/usr/lib/pymodules/python2.7/rql/analyze.py", line 407, in visit_union
    self._visit(select)
  File "/usr/lib/pymodules/python2.7/rql/analyze.py", line 353, in _visit
    func(node)
  File "/usr/lib/pymodules/python2.7/rql/analyze.py", line 473, in visit_select
    self.solve(node, constraints)
  File "/usr/lib/pymodules/python2.7/rql/analyze.py", line 342, in solve
    raise TypeResolverException(ex_msg)
TypeResolverException: Unable to resolve variables types in "Any X WHERE X is TestExecution, NOT EXISTS(X in_state S)"

instance ptonelli-jpl not upgraded: Unable to resolve variables types in "Any X WHERE X is TestExecution, NOT EXISTS(X in_state S)"

root@ptonelli-test-jpl:~# cubicweb-ctl shell ptonelli-jpl -f
/usr/lib/python2.7/dist-packages/cubicweb/rset.py:547: Warning: [yams 0.38] no targettype specified and there are several relation definitions for rtype cw_schema: ['CWRelation', 'CWEType', 'CWRType']. Yet you get the first rdef.
  rdef = eschema.rdef(attr, role)
entering the migration python shell
just type migration commands or arbitrary python code and type ENTER to execute it
type "exit" or Ctrl-D to quit the shell and resume operation
>>> te_eids = rql('Any X WHERE '
...     'X is TestExecution, '
...     'NOT EXISTS(X in_state S)'
...     )
>>> te_eids
<resultset 'Any X WHERE X is TestExecution, NOT EXISTS(X in_state S)' (119 rows):
[590063] (('TestExecution',))
[521627] (('TestExecution',))
[600467] (('TestExecution',))
[609413] (('TestExecution',))
[578327] (('TestExecution',))
[628441] (('TestExecution',))
[607180] (('TestExecution',))
[628436] (('TestExecution',))
[600682] (('TestExecution',))
[625656] (('TestExecution',))
... (('TestExecution',))>
>>>

The narval migration script (creates the workflow for Plan):

# -*- coding: utf-8 -*-
from cubicweb import Binary
from cubicweb.server.session import Session

# TestExecution now have a workflow
rql('SET WF workflow_of TE, TE default_workflow WF WHERE WF workflow_of P, '
    'P name "Plan", TE name "TestExecution"')
commit()

log_list = []


if confirm('Upgrade all execution_logs to file objects ?'):
    for plan in rql('Any X WHERE X is TestExecution').entities():
        if plan.execution_log is not None:
            if isinstance(plan.execution_log,(tuple,list)):
                execution_log = ''.join(plan.execution_log)
            else:
                execution_log = plan.execution_log
            assert isinstance(execution_log, str)
            log_file = create_entity('File', data_name=u"execution_log.txt",
                    data=Binary(execution_log.encode('utf-8')),
                    data_encoding='utf-8',
                    ask_confirm=False)
            log_list.append((plan.eid, log_file.eid))

# transform log into log_files
rset = rql('Any X, F WHERE X is IN (TestExecution, CheckResult), X log_file F')
rename_relation('log_file','execution_archive')
commit()
import pdb;pdb.set_trace()
# for each TestExecution, Plan ...
te_eids = rql('DISTINCT Any X WHERE '
    'X is TestExecution, '
    'NOT EXISTS(X in_state S)'
    )

# set it in correct initial state
for (eid,) in te_eids:
    print 'found entity to make ready'
    session.entity_from_eid(eid).cw_adapt_to('IWorkflowable').set_initial_state('ready')

commit()

#fire all necessary transitions
narval_user = rql('Any U WHERE U in_group G, G name "narval"').get_entity(0,0)
narval_session = Session(narval_user, repo)
narval_session.set_cnxset()

to_update = []
for (eid,) in te_eids:
    te = narval_session.entity_from_eid(eid)
    if te.starttime:
        tri = te.cw_adapt_to('IWorkflowable').fire_transition('start')
        to_update.append((tri.eid, te.starttime))
    if te.endtime:
        if te.status == 'success':
            tri = te.cw_adapt_to('IWorkflowable').fire_transition('end')
            to_update.append((tri.eid, te.endtime))
        elif te.status == 'killed':
            tri = te.cw_adapt_to('IWorkflowable').fire_transition('kill')
            to_update.append((tri.eid, te.endtime))
        else:
            tri = te.cw_adapt_to('IWorkflowable').fire_transition('fail')
            to_update.append((tri.eid, te.endtime))

narval_session.commit()
narval_session.close()

# set correct timestamps
for eid, date in to_update:
    rql('SET TI creation_date %(date)s WHERE TI eid %(eid)s ', dict(date=date, eid=eid))
commit()

add_relation_type('log_file')
drop_attribute('CheckResult', 'log')
drop_attribute('TestExecution', 'log')
drop_attribute('TestExecution', 'starttime')
drop_attribute('TestExecution', 'endtime')
drop_attribute('TestExecution', 'execution_status')
# coming from apycot
drop_attribute('Plan', 'execution_status')
drop_attribute('Plan', 'starttime')
drop_attribute('Plan', 'endtime')

sync_schema_props_perms()

# transform attribute to file
for plan_eid, log_file_eid in log_list:
    rql('SET X execution_log Y WHERE X eid %(x)s, Y eid %(y)s',
        dict(x=plan_eid, y=log_file_eid))

commit()


if confirm('Upgrade all log_files to file objects ?'):
    for xeid, fcontent in rset:
        if fcontent is not None:
            log_file = create_entity('File', data_name=u"log_file.txt",
                                     data=Binary(fcontent.encode('utf-8')),
                                     data_encoding='utf-8',
                                     ask_confirm=False)
            log_file.cw_set(reverse_log_file=xeid)
    commit()

from cubes.apycot import recipes
r_script_names = {
        u'apycot.recipe.quick': recipes.quick_script,
        u'apycot.recipe.full': recipes.full_script,
        u'apycot.recipe.scenario_runner': recipes.scenario_runner_script,
        }

warning_msg = """
### WARNING, THIS RECIPE SCRIPT WAS ADDED
### DURING A MIGRATION SCRIPT AND MIGHT
### NOT MATCH WHAT IS SHOWN IN THE LOGS
"""

for r_name in r_script_names:
    ## update script in recipe
    rql('SET X script %(script)s WHERE X is Recipe, X script "#to be updated", '
        'X name %(name)s',
        {'script': r_script_names[r_name],
         'name': r_name})
    ## update script for existing TestExecution and CheckResult entities
    rql('SET X script %(script)s WHERE X execution_of Y, Y name %(name)s',
        {'script': warning_msg + r_script_names[r_name],
         'name': r_name})
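The transition-firing loop above can be replayed standalone. The following sketch (hypothetical and self-contained; it mirrors the ready/running/done/error/killed workflow defined in the apycot script below) shows which transitions a TestExecution record would fire given its recorded starttime/endtime/status:

```python
# Toy replay (illustrative only, no cubicweb imports) of the migration's
# transition-firing logic for one TestExecution record.

TRANSITIONS = {
    ('ready', 'start'): 'running',
    ('running', 'end'): 'done',
    ('running', 'kill'): 'killed',
    ('running', 'fail'): 'error',
}

def replay(starttime, endtime, status):
    """Return (fired transitions, final state) for one execution record."""
    state, fired = 'ready', []
    if starttime:
        state = TRANSITIONS[(state, 'start')]
        fired.append('start')
    if endtime:
        # Same mapping as the script: success -> end, killed -> kill,
        # anything else -> fail.
        trname = {'success': 'end', 'killed': 'kill'}.get(status, 'fail')
        state = TRANSITIONS[(state, trname)]
        fired.append(trname)
    return fired, state
```

Note that, like the original loop, a record with an endtime but no starttime would try to fire an end transition from the initial state and fail.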

The apycot migration script (run after the narval script; it links the Plan workflow to TestExecution):

from cubicweb import Binary
from cubicweb.server.session import Session

# some old instances may not have it
if not rql('CWConstraintType C WHERE C name "IntervalBoundConstraint"'):
    create_entity('CWConstraintType', name=u'IntervalBoundConstraint')

sync_schema_props_perms()

# Correct database
if 'apycot' in config.cubes():
    # Found buggy repo with both source_url and path at NULL
    buggy_repos = rql('Any R WHERE R source_url NULL, R path NULL')
    if buggy_repos and confirm('WARNING: %i repositories have '
                              'no "source_url" and no "path", '
                              'delete them ?' % len(buggy_repos)):
        rql('DELETE Repository R WHERE R source_url NULL, R path NULL')
    # Should have been done in vcreview migration 1.0.0
    buggy_repos_2 = rql('Any R WHERE R has_review NULL')
    if buggy_repos_2 and confirm('WARNING: %i repositories have no "has_review" '
                                'attribute, correct to False ?' % len(buggy_repos_2)):
        rql('SET R has_review False WHERE R has_review NULL')
    commit()

drop_attribute('Plan', 'arguments')
add_attribute('Recipe', 'script')
add_attribute('Plan', 'script')

drop_entity_type('RecipeStep')
drop_entity_type('RecipeTransition')
drop_entity_type('NarvalConditionExpression')
drop_entity_type('RecipeTransitionCondition')
drop_entity_type('RecipeStepInput')
drop_entity_type('RecipeStepOutput')

for rtype in ('in_recipe', 'start_step', 'end_step', 'in_steps', 'in_error_steps', 'out_steps', 'conditions', 'takes_input', 'generates_output', 'matches_condition'):
    if rtype not in fsschema:
        drop_relation_type(rtype)

if not rql('Any X, WF WHERE X name "Plan", WF workflow_of X'):
    make_workflowable('Plan')

    wf = add_workflow(u'Plan execution status workflow', 'Plan')

    ready = wf.add_state(_('ready'), initial=True)
    running = wf.add_state(_('running'))
    done = wf.add_state(_('done'))
    error = wf.add_state(_('error'), description=_('The execution of the Plan has been terminated (due to unexpected error)'))
    killed = wf.add_state(_('killed'), description=_('The execution of the Plan has been terminated (due to resource exhaustion)'))
    wf.add_transition(_('start'), ready, running, requiredgroups=('narval',))
    wf.add_transition(_('end'), running, done, requiredgroups=('narval',))
    wf.add_transition(_('fail'), running, error, requiredgroups=('narval',))
    wf.add_transition(_('kill'), running, killed, requiredgroups=('narval',))

install_custom_sql_scripts('narval')

if 'file' not in config.cubes():
    add_cube('file')

log_list = []

rql('SET X script "#to be updated" WHERE X is Recipe, X script NULL')

if confirm('Upgrade all execution_logs to file objects ?'):
    for plan in rql('Any X WHERE X is Plan').entities():
        if plan.execution_log is not None:
            log_file = create_entity('File', data_name=u"execution_log.txt",
                    data=Binary(plan.execution_log.encode('utf-8')),
                    data_encoding='utf-8',
                    ask_confirm=False)
            log_list.append([plan.eid, log_file.eid])

drop_attribute('Plan', 'execution_log')
drop_relation_type('execution_log')
add_relation_type('execution_log')

sync_schema_props_perms()

# transform attribute to file
for plan_eid, log_file_eid in log_list:
    rql('SET X execution_log Y WHERE X eid %(x)s, Y eid %(y)s',
        {'x': plan_eid, 'y': log_file_eid})

commit()
### XXX Cannot do apycot migration without commenting these
drop_attribute('Plan', 'execution_status')
drop_attribute('Plan', 'starttime')
drop_attribute('Plan', 'endtime')
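The ticket was closed by "#fe267b7336f3 [migration] always rebuild infered relation". A minimal sketch of the cache-staleness pattern that fix addresses (hypothetical classes and method names, not cubicweb's API):

```python
# Toy schema with a derived ("inferred") view of relation definitions.
# If a schema change does not rebuild the derived view, query-time type
# resolution works from stale data -- the bug pattern in this ticket.

class Schema:
    def __init__(self):
        self.rdefs = {}      # explicit relation definitions: rtype -> {(subj, obj)}
        self._inferred = {}  # cached derived view used at query time

    def add_rdef(self, rtype, subj, obj, rebuild=True):
        self.rdefs.setdefault(rtype, set()).add((subj, obj))
        if rebuild:
            # The fix: always rebuild the derived cache after a change.
            self.rebuild_inferred()

    def rebuild_inferred(self):
        # Stand-in for cubicweb's richer inferred-relation computation.
        self._inferred = {rt: set(pairs) for rt, pairs in self.rdefs.items()}

    def supports(self, subj, rtype):
        return any(s == subj for s, _ in self._inferred.get(rtype, ()))
```

With `rebuild=False` the cache stays stale and `supports('TestExecution', 'in_state')` is False even though the definition exists, mirroring why the migration failed while a freshly started shell (which recomputes everything on startup) succeeded.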
priority: important
type: bug
done in: 3.18.4
load: 1.000
load left: 0.000
closed by: #fe267b7336f3 [migration] always rebuild infered relation
patch: [migration] always rebuild infered relation [applied]