Timestamp:
    25 Apr 2015, 06:13:18 (10 years ago)
Location:
    main/waeup.kofa/trunk
Files:
    6 edited
main/waeup.kofa/trunk/docs/source/userdocs/datacenter/export.rst
(r12870 → r12883; whitespace re-wrapping)

.. note::

   The list of exported columns is usually heavily customized.
   In the Kofa base package only very few columns are exported. In
   some Kofa custom packages tons of data are gathered from
   applicants and students, and the number of columns increases
   accordingly.

The `title` attribute unveils the name of the exporter under which …

…

Student Data Exporters can be further configured through a
configuration page. Search parameters like the student's current level,
current session and current study mode can be set to filter sets of
students in order to decrease the size of the export file. The set of
filter parameters varies and depends on the 'location' from where …
main/waeup.kofa/trunk/docs/source/userdocs/datacenter/import_processors.rst
(r12882 → r12883)

All batch processors inherit from the
:py:class:`waeup.kofa.utils.batching.BatchProcessor` base class. The
`doImport` method, described above, always remains unchanged. All
processors have a property `available_fields` which defines the set of
importable data. These fields correspond to the column titles of the
import file. Available fields are usually composed of location fields,
interface fields and additional fields. Overlaps are possible.
Location fields define the minimum set of fields which are necessary
to locate an existing object in order to update or remove it.
Interface fields (schema fields) are the fields defined in the
interface of the data entity. Additional fields are additionally
needed for data processing. We further distinguish between required
and optional fields, and between schema and non-schema fields.

In the following we list all available processors of the Kofa base
package, including some important methods which describe them best. We
do not list the available fields of each processor here; they are
shown in the browser user interface on the upload page of the portal.

Regular Processors

…

Certificate Course Processor
----------------------------

.. autoclass:: waeup.kofa.university.batching.CertificateCourseProcessor()
   :noindex:

The following sections are new in r12883:

Access Code Batch Processor
---------------------------

.. autoclass:: waeup.kofa.accesscodes.batching.AccessCodeBatchProcessor()
   :noindex:

Access Code Processor
---------------------

.. autoclass:: waeup.kofa.accesscodes.batching.AccessCodeProcessor()
   :noindex:

Hostel Processor
----------------

.. autoclass:: waeup.kofa.hostels.batching.HostelProcessor()
   :noindex:

…

.. note::

   The student data processors described so far are mainly intended for
   restoring data. If the portal is operated correctly and without
   interruption, and students follow the workflow from their first to
   the final study year, there is no need to use the above batch
   processors to maintain the data. The processors are not part of the
   student registration management. The following processors can, or
   sometimes even must, be integrated into the regular management of
   student data. Scores have to be imported, new payment tickets have
   to be created, verdicts have to be set, or workflow transitions have
   to be triggered.

…

Course Ticket Processor
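The grouping of `available_fields` described above (location fields to find an object, interface/schema fields from the entity's interface, additional fields for processing, with overlaps allowed) can be sketched as follows. This is an illustrative stand-in, not the actual Kofa `BatchProcessor` API; the class and field names below are invented:

```python
# Hypothetical sketch (not the real Kofa API) of how a processor's
# available_fields could be composed from three overlapping groups.
class FakeCourseProcessor:
    # fields needed to locate an existing object for update/remove
    location_fields = ['code']
    # fields defined in the interface (schema) of the data entity
    schema_fields = ['code', 'title', 'credits']
    # extra fields needed only during data processing
    additional_fields = ['transition']

    @property
    def available_fields(self):
        # union of all three groups, duplicates removed, order kept
        seen, result = set(), []
        for name in (self.location_fields + self.schema_fields
                     + self.additional_fields):
            if name not in seen:
                seen.add(name)
                result.append(name)
        return result

proc = FakeCourseProcessor()
print(proc.available_fields)
# ['code', 'title', 'credits', 'transition']
```

Note how `code` appears in both the location and schema groups but is listed only once: overlaps are possible, and the column titles of an import file simply have to match names from this combined list.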
main/waeup.kofa/trunk/docs/source/userdocs/datacenter/import_stages.rst
(r12872 → r12883; the ``.. important::`` block is new)

…the next step (**import step 4**).

.. important::

   Data center managers, who are only charged with uploading files but
   not with the import of files, are requested to proceed up to import
   step 3 and verify that the data format meets all the import criteria
   and requirements of the batch processor.

Stage 3: Data Validation and Import
===================================
main/waeup.kofa/trunk/src/waeup/kofa/accesscodes/batching.py
(r9706 → r12883)

 class AccessCodeBatchProcessor(BatchProcessor):
-    """A batch processor for IAccessCodeBatch objects.
+    """The Access Code Batch Processor imports containers for access codes.
+    It does not import their content. There is nothing special about this
+    processor.
     """
     grok.implements(IBatchProcessor)

…

 class AccessCodeProcessor(BatchProcessor):
-    """A batch processor for IAccessCode objects.
+    """The Access Code Processor imports access codes (ac) into their
+    batches. Localization requires the `representation` of the ac
+    (object id) as well as `batch_prefix` and `batch_num` to find
+    the parent container (the batch).
+
+    Access codes follow a workflow. The `checkConversion` method validates
+    the workflow `state` and `transition` in row. `checkUpdateRequirements`
+    checks if the transition is allowed. This depends on the context.
     """
     grok.implements(IBatchProcessor)

…

     def addEntry(self, obj, row, site):
         parent = self.getParent(row, site)
-        obj.batch_serial = row['batch_serial']
-        obj.random_num = row['random_num']
         parent[row['representation']] = obj
         return
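The new docstring says the processor finds the parent batch from `batch_prefix` and `batch_num` and stores the access code under its `representation`. A minimal standalone sketch of that lookup pattern, using an invented dictionary-based site layout (the real Kofa site structure may differ):

```python
# Hypothetical sketch of locating a parent batch and adding an entry.
# The 'accesscodes' container layout and name scheme are assumptions
# for illustration, not the actual Kofa data model.
site = {'accesscodes': {'APP-1': {}}}

def get_parent(row, site):
    # batch name assumed to be composed from prefix and number
    batch_name = '%s-%s' % (row['batch_prefix'], row['batch_num'])
    return site['accesscodes'][batch_name]

def add_entry(obj, row, site):
    parent = get_parent(row, site)
    # the access code's `representation` serves as the object id
    parent[row['representation']] = obj
    return parent

row = {'batch_prefix': 'APP', 'batch_num': '1',
       'representation': 'APP-1-0123456789'}
batch = add_entry(object(), row, site)
print(sorted(batch))
# ['APP-1-0123456789']
```

This also illustrates why `batch_serial` and `random_num` no longer need to be set in `addEntry` above: only the container lookup and the `representation` key matter for placing the object.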
main/waeup.kofa/trunk/src/waeup/kofa/hostels/batching.py
(r11891 → r12883)

 class HostelProcessor(BatchProcessor):
-    """A batch processor for IHostel objects.
+    """The Hostel Procesor imports hostels, i.e. the container objects of
+    beds. It does not import beds. There is nothing special about this
+    processor.
     """
     grok.implements(IBatchProcessor)
main/waeup.kofa/trunk/src/waeup/kofa/students/batching.py
(r12882 → r12883)

         errs, inv_errs, conv_dict = converter.fromStringDict(
             row, self.factory_name, mode=mode)
+        # We cannot import both state and transition.
+        if 'transition' in row and 'state' in row:
+            if row['transition'] not in (IGNORE_MARKER, '') and \
+                row['state'] not in (IGNORE_MARKER, ''):
+                errs.append(('workflow','not allowed'))
+                return errs, inv_errs, conv_dict
         if 'transition' in row:
             if row['transition'] not in IMPORTABLE_TRANSITIONS:
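The check added in this hunk rejects rows that try to import a workflow `state` and a `transition` at the same time. A minimal standalone sketch of the same rule (with a stand-in `IGNORE_MARKER` value, not the Kofa constant):

```python
# Minimal sketch of the mutual-exclusion rule added in r12883:
# a row may carry either a workflow `state` or a `transition`,
# but not both. IGNORE_MARKER is a stand-in, not the Kofa constant.
IGNORE_MARKER = '<IGNORE>'

def check_state_transition(row):
    """Return a list of (field, error) tuples; empty if the row is fine."""
    errs = []
    if 'transition' in row and 'state' in row:
        if row['transition'] not in (IGNORE_MARKER, '') and \
           row['state'] not in (IGNORE_MARKER, ''):
            errs.append(('workflow', 'not allowed'))
    return errs

print(check_state_transition({'state': 'created', 'transition': 'admit'}))
# [('workflow', 'not allowed')]
print(check_state_transition({'state': 'created',
                              'transition': IGNORE_MARKER}))
# []
```

A column may still be present but neutralized with the ignore marker or an empty string, which is why the check tests the values and not just the presence of both keys.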