Changeset 12883 for main/waeup.kofa/trunk/docs/source/userdocs/datacenter
- Timestamp: 25 Apr 2015, 06:13:18 (10 years ago)
- Location: main/waeup.kofa/trunk/docs/source/userdocs/datacenter
- Files: 3 edited
main/waeup.kofa/trunk/docs/source/userdocs/datacenter/export.rst
(r12870 to r12883)

.. note::

   The list of exported columns is usually subject to heavy
   customization. In the Kofa base package only very few columns are
   exported. In some Kofa custom packages a great deal of data is
   gathered from applicants and students, and the number of columns
   increases accordingly.

The `title` attribute unveils the name of the exporter under which …

…

Student Data Exporters can be further configured through a
configuration page. Search parameters like the student's current
level, current session and current study mode can be set to filter
sets of students in order to decrease the size of the export file.
The set of filter parameters varies and depends on the 'location'
from where …
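To make the filter parameters concrete, the following minimal,
self-contained Python sketch mimics what such a configuration does:
it narrows a set of student records by current level, session and
study mode before writing the export file. It is not taken from the
Kofa code base; the record layout and field names are assumptions
chosen for illustration only.

.. code-block:: python

   import csv

   # Hypothetical student records; field names are illustrative only.
   students = [
       {"student_id": "K1000001", "current_level": 100,
        "current_session": 2014, "current_mode": "ug_ft"},
       {"student_id": "K1000002", "current_level": 200,
        "current_session": 2014, "current_mode": "ug_ft"},
   ]

   # Filter parameters as they might be set on the configuration page.
   filters = {"current_level": 100, "current_session": 2014,
              "current_mode": "ug_ft"}

   selected = [record for record in students
               if all(record[key] == value
                      for key, value in filters.items())]

   # Write only the filtered subset to the export file.
   with open("students_export.csv", "w", newline="") as export_file:
       writer = csv.DictWriter(export_file, fieldnames=sorted(students[0]))
       writer.writeheader()
       writer.writerows(selected)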
main/waeup.kofa/trunk/docs/source/userdocs/datacenter/import_processors.rst
(r12882 to r12883)

All batch processors inherit from the
:py:class:`waeup.kofa.utils.batching.BatchProcessor` base class. The
`doImport` method, described above, always remains unchanged. All
processors have a property `available_fields` which defines the set
of importable data. These fields correspond to the column titles of
the import file. Available fields are usually composed of location
fields, interface fields and additional fields; overlaps are
possible. Location fields define the minimum set of fields which are
necessary to locate an existing object in order to update or remove
it. Interface fields (schema fields) are the fields defined in the
interface of the data entity. Additional fields are extra fields
needed for data processing. We further distinguish between required
and optional fields, and between schema and non-schema fields.

In the following we list all available processors of the Kofa base
package, including some important methods which describe them best.
We do not list the available fields of each processor here; they are
shown in the browser user interface on the upload page of the portal.
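To make these field categories concrete, here is a minimal,
self-contained Python sketch. It is not taken from the Kofa code
base, and the field names are assumptions chosen for illustration
only: a header row of an import file is checked against the available
fields and the location fields of a hypothetical processor.

.. code-block:: python

   import csv
   import io

   # Hypothetical field categories; the names are illustrative only.
   location_fields = {"student_id"}               # locate an existing object
   interface_fields = {"firstname", "lastname"}   # schema fields of the entity
   additional_fields = {"password"}               # extra fields for processing

   available_fields = location_fields | interface_fields | additional_fields

   sample_csv = "student_id,firstname,lastname\nK1000001,Aina,Bello\n"
   header = set(csv.DictReader(io.StringIO(sample_csv)).fieldnames)

   # Column titles must be available fields of the processor ...
   unknown_columns = header - available_fields
   # ... and an update or removal needs the location fields to find the object.
   missing_locators = location_fields - header

   print("unknown columns:", unknown_columns or "none")
   print("missing location fields:", missing_locators or "none")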
Regular Processors
…

Certificate Course Processor
----------------------------

.. autoclass:: waeup.kofa.university.batching.CertificateCourseProcessor()
   :noindex:

Access Code Batch Processor
---------------------------

.. autoclass:: waeup.kofa.accesscodes.batching.AccessCodeBatchProcessor()
   :noindex:

Access Code Processor
---------------------

.. autoclass:: waeup.kofa.accesscodes.batching.AccessCodeProcessor()
   :noindex:

Hostel Processor
----------------

.. autoclass:: waeup.kofa.hostels.batching.HostelProcessor()
   :noindex:

…

.. note::

   The student data processors described so far are mainly intended
   for restoring data. If the portal is operated correctly and without
   interruption, and students follow the workflow from their first to
   their final study year, there is no need to use the above batch
   processors to maintain the data. These processors are not part of
   the student registration management. The following processors can,
   or sometimes even must, be integrated into the regular management
   of student data: scores have to be imported, new payment tickets
   have to be created, verdicts have to be set, or workflow
   transitions have to be triggered.

Course Ticket Processor
…
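Conceptually, a score import in update mode locates each existing
course ticket via the location fields of a row and then overwrites
the schema field carried in that row. The toy sketch below
illustrates this idea only; the data structure and field names are
assumptions, not the Kofa implementation, and the authoritative
column titles are the available fields shown on the upload page.

.. code-block:: python

   # Toy data store standing in for the student database.
   course_tickets = {
       ("K1000001", "200", "MTH201"): {"score": None},
       ("K1000002", "200", "MTH201"): {"score": None},
   }

   # One row of a hypothetical score import file in update mode.
   row = {"student_id": "K1000001", "level": "200",
          "code": "MTH201", "score": "58"}

   # The location fields identify the existing ticket ...
   key = (row["student_id"], row["level"], row["code"])
   ticket = course_tickets[key]
   # ... and the remaining schema field is updated.
   ticket["score"] = int(row["score"])

   print(course_tickets[key])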
main/waeup.kofa/trunk/docs/source/userdocs/datacenter/import_stages.rst
(r12872 to r12883)

… the next step (**import step 4**).

.. important::

   Data center managers who are only charged with uploading files,
   but not with importing them, are requested to proceed up to import
   step 3 and verify that the data format meets all the import
   criteria and requirements of the batch processor.

Stage 3: Data Validation and Import
===================================
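The format check performed up to this point can be pictured roughly
as follows. This is a simplified, self-contained sketch and not the
actual Kofa validation code; the required field names are assumptions
for illustration, since the real criteria depend on the chosen batch
processor.

.. code-block:: python

   import csv
   import io

   # Hypothetical requirements for the uploaded file.
   required_fields = {"student_id", "score"}

   sample_csv = "student_id,score\nK1000001,68\nK1000002,\n"

   good_rows, bad_rows = [], []
   for row in csv.DictReader(io.StringIO(sample_csv)):
       missing = [name for name in required_fields if not row.get(name)]
       if missing:
           bad_rows.append((row, missing))   # reported back for correction
       else:
           good_rows.append(row)             # handed on to the processor

   print(len(good_rows), "valid row(s),", len(bad_rows), "row(s) with errors")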