36 results for COCCOLITHOPHORE BLOOM
Abstract:
This study investigates a way to systematically integrate information literacy (IL) into an undergraduate academic programme and develops a model for integrating information literacy across higher education curricula. Curricular integration of information literacy in this study means weaving information literacy into an academic curriculum. In the associated literature, it is also referred to as the information literacy embedding approach or the intra-curricular approach. The key findings identified from this study are presented in four categories: the characteristics of IL integration; the key stakeholders in IL integration; IL curricular design strategies; and the process of IL curricular integration. Three key characteristics of the curricular integration of IL are identified: collaboration and negotiation, contextualisation, and ongoing interaction with information. The key stakeholders in the curricular integration of IL are recognised as the librarians, the course coordinators and lecturers, the heads of faculties or departments, and the students. Strategies for IL curricular design include the use of IL policies and standards in IL curricular design, the combination of face-to-face and online teaching as an emerging trend, and the use of IL assessment tools, which play an important role in IL integration. IL can be integrated into the intended curriculum (what an institution expects its students to learn), the offered curriculum (what the teachers teach) and the received curriculum (what students actually learn). IL integration is a process of negotiation, collaboration and the implementation of the intended curriculum. IL can be integrated at different levels of curricula, such as the institutional, faculty, departmental, course and class curriculum levels. Based on these key findings, an IL curricular integration model is developed. The model integrates curriculum, pedagogy and learning theories, IL theories, IL guidelines and the collaboration of multiple partners. The model provides a practical approach to integrating IL into multiple courses across an academic degree. The development of the model was based on the IL integration experiences of various disciplines in three universities and the implementation experience of an engineering programme at another university; thus it may be of interest to other disciplines. The model has the potential to enhance IL teaching and learning and curricular development, and to support the implementation of graduate attributes in higher education. Sociocultural theories are applied to the research process and IL curricular design of this study. Sociocultural theories describe learning as being embedded within social events and occurring as learners interact with other people, objects and events in a collaborative environment. Sociocultural theories are applied to explore how academic staff and librarians experience the curricular integration of IL; they also support collaboration in the curricular integration of IL and the development of an IL integration model. This study consists of two phases. Phase I (2007) was the interview phase, in which both academic staff and librarians at three IL-active universities were interviewed. During this phase, attention was paid specifically to the practical process of curricular integration of IL and IL activity design. Phase II, the development phase (2007-2008), was conducted at a fourth university. This phase explored the systematic integration of IL into an engineering degree from Year 1 to Year 4.
Learning theories such as sociocultural theories, Bloom’s Taxonomy and IL theories are used in IL curricular development. Based on the findings from both phases, an IL integration model was developed. The findings and the model contribute to IL education, research and curricular development in higher education. The sociocultural approach adopted in this study also extends the application of sociocultural theories to the IL integration process and curricular design in higher education.
Abstract:
‘Explosive Revelations’ employs the device of the Hollywood-style explosion to expose the constructed and futile nature of the moving image. Pointless, impotent explosions bloom and fade, punctuating a non-existent narrative – they promise the spectacle of violence but destroy nothing and disappear without a trace. The video itself is sourced from a stock footage supplier that provides users with a selection of explosions that can be inserted into movies by masking out the background. However, the footage is not used as intended; the explosions are left erupting on top of a black background, fizzling out into non-existence. The work was included in the 2008 'Light in Winter' program at Federation Square, Melbourne, directed by Robyn Archer.
Abstract:
EMR (Electronic Medical Record) is an emerging technology that blends the IT and non-IT domains, and one way to link them is to construct databases. Nowadays, an EMR supports patient care before and after treatment and should satisfy all stakeholders, such as practitioners, nurses, researchers, administrators and financial departments. For database maintenance, the DAS (Data as Service) model is one outsourcing solution. However, there are scalability and strategy issues to address when planning to use the DAS model properly. We constructed three kinds of databases, plain-text, MS built-in encryption (an in-house model) and a custom AES (Advanced Encryption Standard) DAS model, scaling from 5K to 2560K records. To improve the performance of the custom AES-DAS model, we also devised a bucket index using a Bloom filter. The simulation showed that response times increased arithmetically at first but, after a certain threshold, increased exponentially. In conclusion, if the database model is close to an in-house model, vendor technology is a good way to obtain consistent query response times. If the model is a DAS model, it is easy to outsource the database; however, techniques such as the bucket index enhance its utility. To obtain faster query response times, database design considerations such as field type are also important. This study suggests that cloud computing would be the next DAS model to satisfy the scalability and security requirements.
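To make the bucket index with a Bloom filter concrete, here is a minimal sketch. The class names, the round-robin bucket assignment and Python's `cryptography` Fernet cipher (an AES-based scheme) standing in for the custom AES module are all illustrative assumptions, not the authors' implementation. Each partition of encrypted rows keeps a Bloom filter of the plaintext keys it holds, so an exact-match query decrypts only the partitions whose filter reports a possible hit.

```python
import hashlib
from cryptography.fernet import Fernet  # AES-based cipher, a stand-in for the custom AES module (assumption)


class BloomFilter:
    """Fixed-size Bloom filter using k salted SHA-256 hashes packed into one int."""

    def __init__(self, m: int = 1024, k: int = 3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item: str):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item: str) -> None:
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item: str) -> bool:
        # False positives are possible; false negatives are not.
        return all(self.bits >> p & 1 for p in self._positions(item))


class BucketIndexedStore:
    """Encrypted rows spread across partitions; each partition keeps a Bloom
    filter of its keys so a query skips partitions that cannot match."""

    def __init__(self, partitions: int = 70):  # 70 partitions, the abstract's best-performing case
        self.cipher = Fernet(Fernet.generate_key())
        self.buckets = [([], BloomFilter()) for _ in range(partitions)]
        self._next = 0  # round-robin assignment (illustrative choice)

    def insert(self, key: str, record: str) -> None:
        rows, bloom = self.buckets[self._next % len(self.buckets)]
        self._next += 1
        rows.append(self.cipher.encrypt(f"{key}|{record}".encode()))
        bloom.add(key)

    def query(self, key: str):
        for rows, bloom in self.buckets:
            if not bloom.might_contain(key):
                continue  # skip this bucket without decrypting anything
            for token in rows:
                k, _, record = self.cipher.decrypt(token).decode().partition("|")
                if k == key:  # discard Bloom-filter false positives
                    yield record
```

The point of the design is the `might_contain` check: most buckets are ruled out by a few hash probes rather than by decryption, which is consistent with the response-time gains the abstract reports.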
Abstract:
Electronic Health Record (EHR) retrieval processes are complex and place exponentially growing demands on Information Technology (IT) resources, in particular memory usage. A Database-as-a-Service (DAS) model approach is proposed to meet the scalability requirements of EHR retrieval processes. A simulation study using a range of EHR record volumes with the DAS model is presented. A bucket-indexing model, incorporating partitioning fields and Bloom filters within a Singleton design pattern, was used to implement a custom database encryption system. It effectively provided faster responses for range queries, compared with other query types such as aggregation queries, across the DAS, built-in encryption and plain-text DBMS configurations. The study also presents constraints of the approach that should be considered in other practical applications.
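The phrase "Singleton design pattern" suggests a single shared encryption and indexing service. Below is a hedged sketch of that pattern alone; the class name and the Fernet cipher are illustrative assumptions, not the study's custom encryption system.

```python
from cryptography.fernet import Fernet  # stand-in cipher (assumption)


class EncryptionService:
    """Process-wide Singleton, so every query path shares one cipher
    instance and one set of bucket/Bloom-filter metadata."""

    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.cipher = Fernet(Fernet.generate_key())
        return cls._instance

    def encrypt(self, plaintext: bytes) -> bytes:
        return self.cipher.encrypt(plaintext)

    def decrypt(self, token: bytes) -> bytes:
        return self.cipher.decrypt(token)


# Every construction returns the same shared instance:
assert EncryptionService() is EncryptionService()
```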
Abstract:
In the recent past, there have been social issues arising from the exposure of sensitive personal data in medical databases. Sensitive personal data should be protected, and access to it must be accounted for. Sensitive information can be protected by encrypting it; the challenge is querying the encrypted information when making decisions, since querying encrypted data is, in practice, a tedious task. We therefore present a more effective method using bucket index and Bloom filter technology. We find that our proposed method has comparatively low memory usage and fast query performance. Simulation approaches to data encryption techniques for improving healthcare decision-making processes are presented in this paper as a case scenario.
Abstract:
In the medical and healthcare arena, patients' data is not just their own personal history but also a valuable, large dataset for finding solutions for diseases. While electronic medical records are becoming popular and are used in healthcare workplaces such as hospitals, as well as by insurance companies and major stakeholders such as physicians and their patients, the accessibility of such information should be dealt with in a way that preserves privacy and security. Thus, finding the best way to keep the data secure has become an important issue in the area of database security. Sensitive medical data should be encrypted in databases. There are many encryption/decryption techniques and algorithms for preserving privacy and security, and their performance is an important factor while medical data is being managed in databases. Another important factor is that stakeholders should identify more cost-effective ways to reduce the total cost of ownership. As an alternative, DAS (Data as Service) is a popular outsourcing model that satisfies cost-effectiveness, but it requires that the encryption/decryption modules be handled by trustworthy stakeholders. This research project focuses on query response times in a DAS model (AES-DAS) and compares the outsourcing model with an in-house model that incorporates the Microsoft built-in encryption scheme in SQL Server. The project includes building a prototype of medical database schemas, and two types of simulations were carried out. The first stage uses six databases to measure the performance of plain-text, Microsoft built-in encryption and AES-DAS configurations; in particular, AES-DAS incorporates symmetric key encryption with AES (Advanced Encryption Standard) and a bucket indexing processor using a Bloom filter. The results are categorised by character type, numeric type, range queries, range queries using the bucket index, and aggregate queries. The second stage is a scalability test from 5K to 2560K records. The main result of these simulations is that, as an outsourcing model, AES-DAS using the bucket index is around 3.32 times faster than plain AES-DAS with 70 partitions on 10K-record databases. Retrieving numeric-typed data takes less time than retrieving character-typed data in AES-DAS. The aggregation query response time in AES-DAS is not as consistent as that in the MS built-in encryption scheme. The scalability test shows that once the DBMS reaches a certain threshold, query response times become rapidly slower. However, more investigation is needed to build on these simulations, to produce further outcomes and to construct a secure EMR (Electronic Medical Record) more efficiently.
Abstract:
Student performance on examinations is influenced by the level of difficulty of the questions. It therefore seems reasonable to propose that assessment of the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries, and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
Abstract:
Educators are faced with many challenging questions in designing an effective curriculum. What prerequisite knowledge do students have before commencing a new subject? At what level of mastery? What is the spread of capabilities between bare-passing students vs. the top-performing group? How does the intended learning specification compare to student performance at the end of a subject? In this paper we present a conceptual model that helps in answering some of these questions. It has the following main capabilities: capturing the learning specification in terms of syllabus topics and outcomes; capturing mastery levels to model progression; capturing the minimal vs. aspirational learning design; capturing confidence and reliability metrics for each of these mappings; and finally, comparing and reflecting on the learning specification against actual student performance. We present a web-based implementation of the model, and validate it by mapping the final exams from four programming subjects against the ACM/IEEE CS2013 topics and outcomes, using Bloom's Taxonomy as the mastery scale. We then import the itemised exam grades from 632 students across the four subjects and compare the demonstrated student performance against the expected learning for each of these. Key contributions of this work are the validated conceptual model for capturing and comparing expected learning vs. demonstrated performance, and a web-based implementation of this model, which is made freely available online as a community resource.
Abstract:
Database security techniques are widely available. Among them, encryption is a well-certified and established technology for protecting sensitive data. However, once encrypted, the data can no longer be easily queried. The performance of the database depends on how the sensitive data is encrypted and on the approach implemented for efficient search and retrieval. In this paper we analyze the database queries and the data properties and propose a suitable mechanism for querying the encrypted database. We propose and analyze a new database encryption algorithm using a Bloom filter with the bucket index method. Finally, we demonstrate the superiority of the proposed algorithm through several experiments, which should be useful for database encryption research and application activities.
Abstract:
The count-min sketch is a useful data structure for recording and estimating the frequency of string occurrences, such as passwords, in sub-linear space with high accuracy. However, it cannot be used to draw conclusions on groups of strings that are similar, for example close in Hamming distance. This paper introduces a variant of the count-min sketch which allows for estimating counts within a specified Hamming distance of the queried string. This variant can be used to prevent users from choosing popular passwords, like the original sketch, but it also allows for a more efficient method of analysing password statistics.
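The paper's variant alters the sketch construction itself; as a point of reference, the hedged baseline below answers the same question with a standard count-min sketch by enumerating the Hamming ball around the queried string and summing the point estimates, which yields an upper bound on the neighbourhood's total count but costs time exponential in the radius. All names are illustrative assumptions, not the paper's construction.

```python
import hashlib
from itertools import combinations, product


class CountMinSketch:
    """Standard count-min sketch: depth rows of width counters; a point
    query returns the minimum counter across rows (never an underestimate)."""

    def __init__(self, width: int = 2048, depth: int = 4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, row: int, item: str) -> int:
        digest = hashlib.sha256(f"{row}:{item}".encode()).digest()
        return int.from_bytes(digest[:8], "big") % self.width

    def add(self, item: str, count: int = 1) -> None:
        for row in range(self.depth):
            self.table[row][self._index(row, item)] += count

    def estimate(self, item: str) -> int:
        return min(self.table[row][self._index(row, item)] for row in range(self.depth))


def hamming_ball(s: str, radius: int, alphabet: str):
    """Yield every string of the same length within `radius` substitutions of s."""
    yield s
    for r in range(1, radius + 1):
        for positions in combinations(range(len(s)), r):
            for subs in product(alphabet, repeat=r):
                if all(s[p] != c for p, c in zip(positions, subs)):  # exactly distance r
                    chars = list(s)
                    for p, c in zip(positions, subs):
                        chars[p] = c
                    yield "".join(chars)


def estimate_within_distance(cms: CountMinSketch, s: str, radius: int, alphabet: str) -> int:
    """Upper bound on total occurrences of strings within `radius` of s."""
    return sum(cms.estimate(t) for t in hamming_ball(s, radius, alphabet))
```

A password service could then reject a candidate whose neighbourhood estimate exceeds a popularity threshold, mirroring the application described above.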
Abstract:
Big Data is a rising IT trend, similar to cloud computing, social networking and ubiquitous computing, and it can offer beneficial scenarios in the e-health arena. In one such scenario, Big Data needs to be kept secure for a long period of time in order to realise benefits such as finding cures for infectious diseases while protecting patient privacy. It is therefore beneficial to analyse Big Data to derive meaningful information while the data is stored securely, which makes the analysis of various database encryption techniques essential. In this study, we simulated three types of technical environments, namely plain-text, Microsoft built-in encryption, and custom Advanced Encryption Standard using a bucket index in Data-as-a-Service. The results showed that custom AES-DaaS has a faster range query response time than MS built-in encryption. Furthermore, the scalability test showed that there are performance thresholds depending on physical IT resources. Therefore, for efficient Big Data management in e-health, it is worth examining these scalability limits even under a cloud computing environment. In addition, when designing an e-health database, both patient privacy and system performance need to be treated as top priorities.
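Complementing the exact-match sketch given earlier, the following shows how a range query can run against bucketised ciphertext. It is a minimal sketch under stated assumptions (fixed bucket boundaries, illustrative names, the Fernet cipher standing in for AES-DaaS): the server side stores only bucket ids and ciphertexts, so a range query fetches the candidate buckets and the client decrypts and discards the false positives.

```python
import bisect
from cryptography.fernet import Fernet  # stand-in cipher (assumption)


class RangeBucketIndex:
    """Numeric domain split at fixed boundaries; the 'server' stores only
    bucket id -> ciphertexts, so a range query fetches candidate buckets
    and the client post-filters bucket-level false positives."""

    def __init__(self, boundaries):
        self.boundaries = sorted(boundaries)  # e.g. [0, 10, 20, 30]
        self.cipher = Fernet(Fernet.generate_key())
        self.server = {}  # bucket id -> list of ciphertexts

    def _bucket_of(self, value: float) -> int:
        return bisect.bisect_right(self.boundaries, value)

    def insert(self, value: float) -> None:
        token = self.cipher.encrypt(str(value).encode())
        self.server.setdefault(self._bucket_of(value), []).append(token)

    def range_query(self, lo: float, hi: float):
        for bucket in range(self._bucket_of(lo), self._bucket_of(hi) + 1):
            for token in self.server.get(bucket, []):
                value = float(self.cipher.decrypt(token).decode())
                if lo <= value <= hi:  # discard bucket false positives
                    yield value
```

Coarser buckets leak less about the ordering of values but force the client to decrypt more non-matching rows, which is one reason the abstracts above report performance thresholds that depend on partitioning choices and physical resources.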
Abstract:
Incorporating a learner’s level of cognitive processing into Learning Analytics presents opportunities for obtaining rich data on the learning process. We propose a framework called COPA that provides a basis for mapping levels of cognitive operation into a learning analytics system. We utilise Bloom’s taxonomy, a theoretically respected conceptualisation of cognitive processing, and apply it in a flexible structure that can be implemented incrementally and with varying degrees of complexity within an educational organisation. We outline how the framework is applied, and its key benefits and limitations. Finally, we apply COPA to a University undergraduate unit, and demonstrate its utility in identifying key missing elements in the structure of the course.
Abstract:
Mooting is modeled principally on appellate advocacy. However, the skill set developed by participating in a moot program – the ability to persuade someone to your preferred position – is indispensable to anyone practising law. Developing effective mooting skills in students necessitates the engagement of coaches with an appropriate understanding of the theories underlying mooting and advocacy practice and their interconnection with each other. This article explains the relevance of the cognitive domain to mooting performance and places it in context with the psychomotor and affective domains.
Abstract:
In June 2011 a large phytoplankton bloom resulted in a catastrophic mortality event that affected a large coastal embayment in the Solomon Islands, covering an area in excess of 20 km² of reef and soft sandy habitats in Marovo Lagoon, the largest double-barrier lagoon in the world. This embayment is home to over 1200 people leading largely subsistence lifestyles and depending on the impacted reefs for the majority of their protein needs. A toxic diatom (Pseudo-nitzschia spp.) and a toxic dinoflagellate (Pyrodinium bahamense var. compressum) reached concentrations of millions of cells per litre. The senescent phytoplankton bloom led to complete de-oxygenation of the water column, which reportedly caused substantial mortality of marine animal life in the immediate area within a rapid timeframe (24 h). Groups affected included holothurians, crabs, and reef and pelagic fish species. Dolphins, reptiles and birds were also found dead within the area, indicating algal toxin accumulation in the food chain. Deep reefs and sediments, whilst initially unaffected, have now been blanketed in large cyanobacterial mats, which have negatively impacted live coral cover, especially within the deep reef zone (> 6 m depth). Reef recovery within the deep zone has been extremely slow and may indicate an alternative state for the system.