Changeset 12883 for main


Timestamp:
25 Apr 2015, 06:13:18 (10 years ago)
Author:
Henrik Bettermann
Message:

More docs.

Location:
main/waeup.kofa/trunk
Files:
6 edited

  • main/waeup.kofa/trunk/docs/source/userdocs/datacenter/export.rst

    r12870 r12883  
    2424.. note::
    2525
    26 The list of exported columns usually underlies heavy customizations.
    27 In the Kofa base package only very few columns are being exported. In
    28 some Kofa custom packages tons of data are being gathered from
    29 applicants and students and the number of columns increase accordingly.
     26  The list of exported columns is usually subject to heavy customization.
     27  In the Kofa base package only very few columns are being exported. In
     28  some Kofa custom packages large amounts of data are being gathered from
     29  applicants and students, and the number of columns increases accordingly.
    3030
    3131The `title` attribute unveils the name of the exporter under which
     
    173173Student Data Exporter can be further configured through a
    174174configuration page. Search parameters like the student's current level,
    175  current session and current study mode can be set to filter sets of
     175current session and current study mode can be set to filter sets of
    176176students in order to decrease the size of the export file. The set of
    177177filter parameters varies and depends on the 'location' from where
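The filtering described in this hunk can be sketched as follows. This is an illustrative assumption only: the function name `filter_students` and field names such as `current_level` are placeholders, not the real Kofa exporter API.

```python
# Illustrative sketch only: narrow a student set by the configuration
# parameters mentioned above (current level, session and study mode).
# Field names like 'current_level' are assumptions, not Kofa's schema.

def filter_students(students, level=None, session=None, mode=None):
    """Return only the students matching all given filter parameters."""
    result = []
    for s in students:
        if level is not None and s['current_level'] != level:
            continue
        if session is not None and s['current_session'] != session:
            continue
        if mode is not None and s['current_mode'] != mode:
            continue
        result.append(s)
    return result
```

Fewer matching students means a smaller export file, which is the purpose of the configuration page.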
  • main/waeup.kofa/trunk/docs/source/userdocs/datacenter/import_processors.rst

    r12882 r12883  
    44****************
    55
    6 All batch processors inherit from the :py:class:`waeup.kofa.utils.batching.BatchProcessor` base class. The `doImport` method, described above, always remains unchanged. All processors have a property `available_fields` which defines the set of importable data. They correspond with the column titles of the import file. Available fields are usually composed of location fields, interface fields and additional fields. Overlaps are possible. Location fields define the minumum set of fields which are necessary to locate an existing object in order to update or remove it. Interface fields (schema fields) are the fields defined in the interface of the data entity. Additional fields are additionally needed for data processing. We further distinguish between required and optional fields or between schema and non-schema fields.
     6All batch processors inherit from the
     7:py:class:`waeup.kofa.utils.batching.BatchProcessor` base class. The
     8`doImport` method, described above, always remains unchanged. All
     9processors have a property `available_fields` which defines the set
     10of importable data. They correspond to the column titles of the
     11import file. Available fields are usually composed of location fields,
     12 interface fields and additional fields. Overlaps are possible.
     13Location fields define the minimum set of fields which are necessary
     14to locate an existing object in order to update or remove it.
     15Interface fields (schema fields) are the fields defined in the
     16interface of the data entity. Additional fields are extra fields
     17needed for data processing. We further distinguish between required
     18and optional fields or between schema and non-schema fields.
    719
    8 In the following we list all available processors of the Kofa base package including some important methods which describe them best. We do not list available fields of each processor here. Available fields are shown in the browser user interface on the upload page of the portal.
     20In the following we list all available processors of the Kofa base
     21package including some important methods which describe them best. We
     22do not list available fields of each processor here. Available fields
     23are shown in the browser user interface on the upload page of the
     24portal.
    925
    1026Regular Processors
     
    4157  :noindex:
    4258
    43 
    4459Certificate Course Processor
    4560----------------------------
    4661
    4762.. autoclass:: waeup.kofa.university.batching.CertificateCourseProcessor()
     63  :noindex:
     64
     65Access Code Batch Processor
     66---------------------------
     67
     68.. autoclass:: waeup.kofa.accesscodes.batching.AccessCodeBatchProcessor()
     69  :noindex:
     70
     71Access Code Processor
     72---------------------
     73
     74.. autoclass:: waeup.kofa.accesscodes.batching.AccessCodeProcessor()
     75  :noindex:
     76
     77Hostel Processor
     78----------------
     79
     80.. autoclass:: waeup.kofa.hostels.batching.HostelProcessor()
    4881  :noindex:
    4982
     
    86119.. note::
    87120
    88   The student data processors described so far are mainly intended for restoring data. If the portal is operated correctly and without interruption and students follow the workflow from their first to the final study year, there is no need to use the above batch processors to maintain the data. The processors are not part of the student registration management. The following processors can or sometimes even must be integrated into the regular management of student data. Scores have to be imported, new payment tickets have to created, the verdicts have to be set or workflow transitions have to be triggered.
     121  The student data processors described so far are mainly intended for
     122  restoring data. If the portal is operated correctly and without
     123  interruption and students follow the workflow from their first to the
     124  final study year, there is no need to use the above batch processors
     125  to maintain the data. The processors are not part of the student
     126  registration management. The following processors can or sometimes
     127  even must be integrated into the regular management of student data.
     128  Scores have to be imported, new payment tickets have to be created, the
     129  verdicts have to be set or workflow transitions have to be triggered.
    89130
    90131Course Ticket Processor
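The field composition described at the top of this file's diff (location, interface and additional fields, with possible overlaps) can be illustrated with a minimal sketch. The class and its field values are hypothetical, not Kofa code:

```python
# Hypothetical sketch (not Kofa code): how `available_fields` is
# typically composed from location, interface and additional fields.

class SketchProcessor:
    # Location fields: the minimum set needed to locate an existing object.
    location_fields = ['student_id']
    # Interface (schema) fields: fields defined in the entity's interface.
    iface_fields = ['firstname', 'lastname', 'email']
    # Additional fields: columns needed only during data processing.
    additional_fields = ['password']

    @property
    def available_fields(self):
        # Overlaps between the three groups are possible, so deduplicate.
        return sorted(set(
            self.location_fields
            + self.iface_fields
            + self.additional_fields))

print(SketchProcessor().available_fields)
```

The resulting sorted set is what a processor would present as importable column titles on the upload page.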
  • main/waeup.kofa/trunk/docs/source/userdocs/datacenter/import_stages.rst

    r12872 r12883  
    7777the next step (**import step 4**).
    7878
     79.. important::
     80
     81  Data center managers, who are only charged with uploading files but
     82  not with the import of files, are requested to proceed up to import step 3
     83  and verify that the data format meets all the import criteria and
     84  requirements of the batch processor.
     85
    7986Stage 3: Data Validation and Import
    8087===================================
  • main/waeup.kofa/trunk/src/waeup/kofa/accesscodes/batching.py

    r9706 r12883  
    3434
    3535class AccessCodeBatchProcessor(BatchProcessor):
    36     """A batch processor for IAccessCodeBatch objects.
     36    """The Access Code Batch Processor imports containers for access codes.
     37    It does not import their content. There is nothing special about this
     38    processor.
    3739    """
    3840    grok.implements(IBatchProcessor)
     
    8587
    8688class AccessCodeProcessor(BatchProcessor):
    87     """A batch processor for IAccessCode objects.
     89    """The Access Code Processor imports access codes (ac) into their
     90    batches. Localization requires the `representation` of the ac
     91    (object id) as well as `batch_prefix` and `batch_num` to find
     92    the parent container (the batch).
     93
     94    Access codes follow a workflow. The `checkConversion` method validates
     95    the workflow `state` and `transition` in the row. `checkUpdateRequirements`
     96    checks if the transition is allowed. This depends on the context.
    8897    """
    8998    grok.implements(IBatchProcessor)
     
    136145    def addEntry(self, obj, row, site):
    137146        parent = self.getParent(row, site)
    138         obj.batch_serial = row['batch_serial']
    139         obj.random_num = row['random_num']
    140147        parent[row['representation']] = obj
    141148        return
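The localization scheme from the docstring above (find the parent batch via `batch_prefix` and `batch_num`, then store the access code under its `representation`) can be sketched like this. The container layout and the `locate_parent` helper are assumptions for illustration, not the real implementation:

```python
# Illustrative sketch of how an access code row is located in its batch.
# The dict-based container layout is an assumption for illustration.

def locate_parent(site, row):
    """Find the parent batch container from `batch_prefix` and `batch_num`."""
    batch_id = '%s-%s' % (row['batch_prefix'], row['batch_num'])
    return site['accesscodes'][batch_id]

site = {'accesscodes': {'APP-1': {}}}
row = {'batch_prefix': 'APP', 'batch_num': '1',
       'representation': 'APP-1-0001'}
parent = locate_parent(site, row)
# As in addEntry above, the object is stored under its representation.
parent[row['representation']] = object()
```

This mirrors the simplified `addEntry` in the hunk above, which no longer sets `batch_serial` and `random_num` by hand.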
  • main/waeup.kofa/trunk/src/waeup/kofa/hostels/batching.py

    r11891 r12883  
    2727
    2828class HostelProcessor(BatchProcessor):
    29     """A batch processor for IHostel objects.
     29    """The Hostel Procesor imports hostels, i.e. the container objects of
     30    beds. It does not import beds. There is nothing special about this
     31    processor.
    3032    """
    3133    grok.implements(IBatchProcessor)
  • main/waeup.kofa/trunk/src/waeup/kofa/students/batching.py

    r12882 r12883  
    296296        errs, inv_errs, conv_dict =  converter.fromStringDict(
    297297            row, self.factory_name, mode=mode)
     298        # We cannot import both state and transition.
     299        if 'transition' in row and 'state' in row:
     300            if row['transition'] not in (IGNORE_MARKER, '') and \
     301                row['state'] not in (IGNORE_MARKER, ''):
     302                errs.append(('workflow','not allowed'))
     303                return errs, inv_errs, conv_dict
    298304        if 'transition' in row:
    299305            if row['transition'] not in IMPORTABLE_TRANSITIONS:
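The guard added in this hunk can be exercised in isolation. The following standalone sketch mirrors its logic; the concrete `IGNORE_MARKER` value used here is an assumption for illustration:

```python
# Sketch of the mutual-exclusion rule added in this changeset:
# a row may carry either `state` or `transition`, but not both.
# The real IGNORE_MARKER value is assumed here for illustration.

IGNORE_MARKER = '<IGNORE>'

def check_workflow_columns(row):
    """Return a list of (field, error) tuples, mimicking checkConversion."""
    errs = []
    if 'transition' in row and 'state' in row:
        if row['transition'] not in (IGNORE_MARKER, '') and \
           row['state'] not in (IGNORE_MARKER, ''):
            errs.append(('workflow', 'not allowed'))
    return errs
```

A row that leaves one of the two columns empty or marked with the ignore marker passes; only rows setting both at once are rejected.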