856 results for Tilting and cotilting modules
Abstract:
One of the leading motivations behind the multilingual semantic web is to make resources digitally accessible in a global, online, multilingual context. Consequently, it is essential for knowledge bases to manage multilingualism and thus to be equipped with procedures for its conceptual modelling. In this context, the goal of this paper is to discuss how common-sense knowledge and cultural knowledge are modelled in a multilingual framework. More particularly, multilingualism and conceptual modelling are approached from the perspective of FunGramKB, a lexico-conceptual knowledge base for natural language understanding. This project argues for a clear division between the lexical and conceptual dimensions of knowledge. Moreover, the conceptual layer is organized into three modules, which result from a strong commitment to capturing semantic knowledge (Ontology), procedural knowledge (Cognicon) and episodic knowledge (Onomasticon). Cultural mismatches are discussed and formally represented at the three conceptual levels of FunGramKB.
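As a rough illustration of the lexical/conceptual split described in this abstract, the Python sketch below models one shared conceptual layer with three modules and per-language lexicons pointing into it. All class names, fields and the concept-ID style are invented for the example, not FunGramKB's actual schema or API.

```python
# Illustrative sketch only: FunGramKB's real schema, IDs and APIs differ.
from dataclasses import dataclass, field

@dataclass
class ConceptualLayer:
    """Toy model of a three-module, language-neutral conceptual layer."""
    ontology: dict = field(default_factory=dict)     # semantic knowledge: concept -> meaning postulate
    cognicon: dict = field(default_factory=dict)     # procedural knowledge: script -> event sequence
    onomasticon: dict = field(default_factory=dict)  # episodic knowledge: named entity -> facts

@dataclass
class Lexicon:
    """One lexicon per language; lemmas map to shared concepts."""
    language: str
    entries: dict = field(default_factory=dict)      # lemma -> concept id

# One conceptual layer shared by all languages...
kb = ConceptualLayer(ontology={"+SIESTA_00": "a short rest taken after lunch"})
# ...and a lexical layer per language pointing into it, so cultural
# concepts are modelled once and lexicalized many times.
english = Lexicon("EN", {"nap": "+SIESTA_00"})
spanish = Lexicon("ES", {"siesta": "+SIESTA_00"})
```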
Abstract:
Following and contributing to the ongoing shift from more structuralist, system-oriented to more pragmatic, socio-culturally oriented anglicism research, this paper examines to what extent the global spread of English affects naming patterns in Flanders. To this end, a diachronic database of first names is constructed, containing the top 75 most popular boys' and girls' names from 2005 to 2014. In a first step, the etymological background of these names is documented and the evolution in popularity of the English names in the database is tracked. Results reveal no notable surge in the preference for English names. This paper complements these database-driven results with an experimental study, aiming to show that, in this case, associations through referents are more telling than associations through phonological form (here based on etymology). Focusing on the socio-cultural background of first names in general and of Anglo-American pop culture in particular, the second part of the study reports on results from a survey in which participants were asked to name the first three celebrities that leap to mind when hearing a certain first name (e.g. Lana, triggering the response Del Rey). Very clear associations are found between certain first names and specific celebrities from Anglo-American pop culture. Linking back to marketing research and the social turn in onomastics, we discuss how these celebrities might function as referees, and how social stereotypes surrounding these referees are metonymically attached to their first names. Similar to the country-of-origin effect in marketing, these metonymical links could very well be the reason why parents select specific "celebrity names". Although further attitudinal research is needed, this paper underscores the importance of including socio-cultural parameters when conducting onomastic research.
Abstract:
For the official publication, see: http://dx.doi.org/10.1016/j.lindif.2016.06.021
Abstract:
Intelligent Tutoring Systems (ITSs) are computerized systems for learning by doing. These systems provide students with immediate, customized feedback on learning tasks. An ITS typically consists of several interconnected modules. This research focuses on the distribution of the ITS module that provides expert knowledge services. Distributing such an expert knowledge module calls for an architectural style, since a style provides a standard interface, which increases the reusability and interoperability of the module. To provide expert knowledge modules in a distributed way, we address the research question: 'How can we compare and evaluate the REST, Web services and Plug-in architectural styles for the distribution of the expert knowledge module in an intelligent tutoring system?'. We present an assessment method for selecting an architectural style. Applying the assessment method to the three styles, we selected REST as the style that best supports the distribution of expert knowledge modules, and we analyzed the trade-offs that come with that choice. We present a prototype and architectural views based on REST to demonstrate that the assessment method correctly scores REST as an appropriate architectural style for the distribution of expert knowledge modules.
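As a sketch of what a REST-style distribution of an expert knowledge module could look like, here is a minimal Flask service exposing diagnosis as a resource. The endpoint path, payload shape and feedback fields are hypothetical, invented for illustration; the paper's actual prototype may differ, and Flask is simply a convenient choice here.

```python
# Hypothetical REST interface for an expert-knowledge module.
# Resource names and payloads are illustrative, not from the paper.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/exercises/<exercise_id>/diagnosis", methods=["POST"])
def diagnose(exercise_id):
    """Return feedback on a student's submitted step for an exercise."""
    step = request.get_json()  # e.g. {"expression": "2x + 3 = 7"}
    # A real expert module would invoke the domain reasoner here.
    feedback = {
        "exercise": exercise_id,
        "correct": step.get("expression") == "x = 2",
        "hint": "Subtract 3 from both sides.",
    }
    return jsonify(feedback)

if __name__ == "__main__":
    app.run(port=8080)
```

Because the interface is just HTTP resources and JSON, any tutoring front end can reuse the module, which is the standard-interface benefit the assessment method credits to REST.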
Abstract:
The BlackEnergy malware targeting critical infrastructures has a long history. It has evolved over time from a simple DDoS platform into a sophisticated plug-in-based malware. The plug-in architecture has a persistent malware core with easily installable, attack-specific modules for DDoS, spamming, info-stealing, remote access, boot-sector formatting, etc. BlackEnergy has been involved in several high-profile cyber-physical attacks, including the Ukraine power grid attack of December 2015. This paper investigates the evolution of BlackEnergy and its cyber-attack capabilities. It presents a basic cyber-attack model used by BlackEnergy for targeting industrial control systems. In particular, the paper analyzes the threats BlackEnergy poses to synchrophasor-based systems, which are used for real-time control and monitoring functionalities in the smart grid. Several BlackEnergy-based attack scenarios are investigated that exploit vulnerabilities in two widely used synchrophasor communication standards: (i) IEEE C37.118 and (ii) IEC 61850-90-5. Specifically, the paper addresses reconnaissance, DDoS, man-in-the-middle and replay/reflection attacks on IEEE C37.118 and IEC 61850-90-5. Further, the paper investigates protection strategies for the detection and prevention of BlackEnergy-based cyber-physical attacks.
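On the detection side, one simple indicator of a replayed IEEE C37.118 data stream is a timestamp that fails to advance, since data frames carry SOC (second-of-century) and FRACSEC fields in their common header. The sketch below is a minimal illustration only: it parses just the common header, hard-codes a TIME_BASE that in reality is read from the configuration frame, and is no substitute for the protection strategies the paper investigates.

```python
# Defensive sketch: flag potentially replayed IEEE C37.118 data frames by
# checking that timestamps advance. Illustration only; a real monitor would
# read TIME_BASE from the configuration frame and verify frame CRCs too.
import struct

TIME_BASE = 1_000_000  # assumed here; actually supplied by the config frame

def frame_time(frame: bytes) -> float:
    """Timestamp from the common header:
    SYNC(2) FRAMESIZE(2) IDCODE(2) SOC(4) FRACSEC(4)."""
    _sync, _size, _idcode, soc, fracsec = struct.unpack(">HHHII", frame[:14])
    # Low 24 bits of FRACSEC hold the fraction-of-second count.
    return soc + (fracsec & 0x00FFFFFF) / TIME_BASE

def looks_replayed(prev_time: float, frame: bytes) -> bool:
    """A data frame whose timestamp does not advance is suspicious."""
    return frame_time(frame) <= prev_time
```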
Abstract:
Let L be a unital Z-graded ring, and let C be a bounded chain complex of finitely generated L-modules. We give a homological characterisation of when C is homotopy equivalent to a bounded complex of finitely generated projective L_0-modules, generalising known results for twisted Laurent polynomial rings. The crucial hypothesis is that L is a strongly graded ring.
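For orientation, these are the standard definitions the abstract relies on (not results of the paper): a Z-grading, and the strong grading condition that upgrades the containment to an equality.

```latex
L \;=\; \bigoplus_{i \in \mathbb{Z}} L_i, \qquad
L_i L_j \subseteq L_{i+j} \ \text{(graded)}, \qquad
L_i L_j \;=\; L_{i+j} \ \text{for all } i,j \ \text{(strongly graded)}.
```

The motivating strongly graded example is a twisted Laurent polynomial ring L = R[t, t^{-1}; α] with L_i = R t^i.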
Abstract:
The main goal of this thesis is the determination of homological invariants of polynomial ideals. We consider different coordinate systems and analyze their significance for the computation of certain invariants. In particular, we provide an algorithm that transforms any ideal into strongly stable position if char k = 0. With a slight modification, this algorithm can also be used to achieve a stable or quasi-stable position. If the field has positive characteristic, the Borel-fixed position is the most our method can achieve. Further, we present some applications of Pommaret bases, focusing on how invariants can be read off directly from such a basis. In the second half of this dissertation we take a closer look at another homological invariant, namely the (absolute) reduction number. It is well known that the reduction number can be read off immediately from the generic initial ideal. However, we show that it is not possible to formulate an algorithm, based on analyzing only the leading ideal, that transforms an ideal into a position from which this invariant can be read off directly. So in general the reduction number cannot be read off from a Pommaret basis. This result motivates a deeper investigation of which properties a coordinate system must possess so that the reduction number can be determined easily, i.e. by analyzing the leading ideal alone. This approach leads to the introduction of generalized versions of the stable positions mentioned above, such as the weakly D-stable and weakly D-minimal stable positions. The latter is a coordinate system that allows the reduction number to be determined without any further computation. Finally, we introduce the notion of β-maximal position, which has many interesting algebraic properties. In particular, this position, in combination with weak D-stability, is sufficient for the weakly D-minimal stable position and is thus connected to the reduction number.
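For reference, the standard (textbook) definition of the strongly stable position targeted by the algorithm in characteristic 0 is the following; it is quoted here for orientation, not as a result of the thesis. With the variables ordered x_1 > x_2 > ... > x_n, a monomial ideal J is strongly stable if

```latex
m \in J,\quad x_j \mid m,\quad i < j
\;\Longrightarrow\; x_i \cdot \frac{m}{x_j} \in J .
```

An ideal is in strongly stable position when its leading ideal is strongly stable; in characteristic 0 this coincides with the Borel-fixed property mentioned above.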
Abstract:
In this paper, a basic conceptual architecture for the design of computer vision systems is qualitatively described. The proposed architecture addresses the design of vision systems in a modular fashion, using modules with three distinct units or components: a processing network or diagnostics unit, a control unit and a communications unit. Control of the system at the module level is designed on the basis of a discrete-events model. This basic methodology has been used to design a real-time active vision system for the detection, tracking and recognition of people. It is made up of three functional modules, aimed respectively at the detection, tracking and recognition of moving individuals, plus a supervision module.
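A schematic rendering of that module pattern in Python follows: one class bundles the three units, with module-level control expressed as a small discrete-event transition table. Unit, state and event names are invented for the example and are not the paper's terminology beyond the three-unit split.

```python
# Illustrative sketch of a three-unit vision module with discrete-event control.
from queue import Queue

class VisionModule:
    """A module = processing/diagnostics unit + control unit + comms unit."""

    def __init__(self, name: str):
        self.name = name
        self.inbox = Queue()   # communications unit: event/message queue
        self.state = "IDLE"    # control unit: current discrete-event state

    def process(self, frame):
        """Processing network / diagnostics unit (stub)."""
        return {"module": self.name, "detections": []}

    def handle_event(self, event: str):
        """Control as a discrete-events model: events drive state changes."""
        transitions = {
            ("IDLE", "START"): "RUNNING",
            ("RUNNING", "TARGET_LOST"): "SEARCHING",
            ("SEARCHING", "TARGET_FOUND"): "RUNNING",
            ("RUNNING", "STOP"): "IDLE",
        }
        self.state = transitions.get((self.state, event), self.state)

detector = VisionModule("detection")
detector.handle_event("START")   # IDLE -> RUNNING
```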
Abstract:
In industrial plants, oil and oil compounds are usually transported through closed pipelines of circular cross-section. The use of radiotracers in oil transport and processing facilities makes it possible to calibrate flowmeters, measure mean residence times in cracking columns, locate points of obstruction or leakage in underground ducts, and investigate flow behavior in industrial processes such as distillation towers. Inspection techniques using radiotracers are non-destructive, simple, economical and highly accurate. Among them, Total Count, which uses a small amount of radiotracer of known activity, is acknowledged as an absolute technique for flow rate measurement. A viscous-fluid transport system was designed and assembled for this research, composed of four PVC pipelines of 13 m length (12 m horizontal and 1 m vertical) and ½, ¾, 1 and 2-inch diameters, respectively, interconnected by maneuvering valves. This system was used to simulate different flow conditions of petroleum compounds and for experimental studies of the flow profile in the horizontal and upward directions. As 198Au presents a single photopeak (411.8 keV), it was chosen as the radioisotope for oil labeling; small amounts (6 ml, with an activity of around 200 kBq) were injected into the oil transport lines. A 2" x 2" NaI scintillation detector with well-defined geometry was used to measure the total activity and determine the calibration factor F and, positioned after a homogenization distance and connected to a standardized electronic set of nuclear instrumentation modules (NIM), to detect the radioactive cloud.
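For reference, the standard total-count relation behind the technique, in the abstract's notation (A = injected activity, F = detector calibration factor, N = total counts recorded as the radioactive cloud passes the detector): with tracer concentration c(t) at the detector and count rate R(t) = F c(t), the mass balance gives

```latex
A \;=\; Q \int c(t)\,dt
\quad\Longrightarrow\quad
N \;=\; \int R(t)\,dt \;=\; F \int c(t)\,dt \;=\; \frac{F\,A}{Q},
\qquad\text{so}\qquad
Q \;=\; \frac{F\,A}{N}.
```

This is the textbook form of the method; determining F experimentally, as described above, is what makes the technique absolute.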
Abstract:
This study had three objectives: (1) to develop a comprehensive truck simulation that executes rapidly, has a modular program construction to allow variation of vehicle characteristics, and can realistically predict vehicle motion and the tire-road surface interaction forces; (2) to develop a model of doweled Portland cement concrete pavement that can be used to determine slab deflection and stress at predetermined nodes, and that allows for the variation of traditional thickness design factors; and (3) to implement these two models on a workstation with suitable menu-driven modules so that both existing and proposed pavements can be evaluated with respect to design life, given specific characteristics of the heavy vehicles that will be using the facility. This report summarizes the work performed during the first year of the study. Briefly, the following has been accomplished: a two-dimensional model of a typical 3-S2 tractor-trailer combination was created; a finite element structural analysis program, ANSYS, was used to model the pavement; and computer runs were performed varying the parameters defining both the vehicle and road elements. The resulting time-specific displacements for each node are plotted, and the displacement basin is generated for defined vehicles. Relative damage to the pavement can then be estimated; a damage function for load replications, reflected in further pavement deterioration, must be assumed. Comparison with actual damage on Interstate 80 will eventually allow verification of these procedures.
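The report leaves the damage function to be assumed; one commonly assumed form in pavement engineering is the AASHO fourth-power load-equivalency law, sketched below for illustration only (the function, standard axle load and exponent are conventional defaults, not values from this study).

```python
# One commonly assumed damage function: the AASHO "fourth-power law".
# Illustrative only; the report itself leaves the damage function open.
def load_equivalency(axle_load_kn: float, standard_kn: float = 80.0,
                     exponent: float = 4.0) -> float:
    """Relative damage of one axle pass vs. a standard 80 kN single axle."""
    return (axle_load_kn / standard_kn) ** exponent

# Example: a 100 kN axle pass does ~2.44x the damage of a standard pass.
print(load_equivalency(100.0))  # 2.4414...
```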
Abstract:
Computer-based simulation games (CSG) are a form of innovation in learning and teaching. CSG are used increasingly and in various ways, such as in class activities (formative exercises) and as part of summative assessments (Leemkuil and De Jong, 2012; Zantow et al., 2005). This study investigates the current and potential use of CSG in Worcester Business School's (WBS) Business Management undergraduate programmes. An initial survey of off-the-shelf simulations reveals several categories of simulation, each offering varying levels of complexity and learning opportunity depending on the field of study. The findings suggest that whilst adoption of CSG in learning and teaching is marginal, there is significant opportunity to increase their use to enhance learning and learner achievement, especially in Level 5 modules. The use of CSG is situational, and adoption should be decided on a case-by-case basis. WBS can play a major role by creating an environment that encourages and supports the use of CSG, as well as other forms of innovative learning and teaching methods. The key recommendation is therefore to provide module teams with further support in embedding and integrating CSG into their modules.
Abstract:
Computers have invaded our offices, our homes, our cars and our coffee-pots; they have become ubiquitous. However, the advance of computing technologies has been accompanied by a decreasing "visibility" of the underlying software and hardware technologies. While we use and accept the computer, we know neither its history nor its functionality. In this paper, we argue that this is not a healthy situation. Moreover, recruitment onto UK Computing degree courses is steadily falling; these courses appear less attractive to school-leavers, which may be associated with this increasing ubiquity. We reflect on an MSc module, Concepts and Philosophy of Computing, and a BSc module, Computer Games Development, developed at the University of Worcester to address these issues. We propose that elements of these modules form a necessary part of the education of all citizens, and we suggest how this may be realized. We also suggest how to re-enthuse our youth about computing as a discipline and halt the drop in recruitment.
Abstract:
The work presented in my thesis addresses the two cornerstones of modern astronomy: observation and instrumentation. Part I deals with the observation of two nearby active galaxies, the Seyfert 2 galaxy NGC 1433 and the Seyfert 1 galaxy NGC 1566, both at a distance of ~10 Mpc and both part of the Nuclei of Galaxies (NUGA) sample. It is well established that every galaxy harbors a supermassive black hole (SMBH) at its center, and there seems to be a fundamental correlation between the stellar bulge and SMBH masses. Simulations show that massive feedback, e.g., powerful outflows in Quasi-Stellar Objects (QSOs), has an impact on the mutual growth of bulge and SMBH. Nearby galaxies follow this relation but accrete mass at much lower rates. This raises the following questions: Which mechanisms allow the feeding of nearby Active Galactic Nuclei (AGN)? Is this feeding triggered by events, e.g., star formation, nuclear spirals, or outflows, on ~500 pc scales around the AGN? Does feedback on these scales play a role in quenching the feeding process? Does it affect star formation close to the nucleus? To answer these questions I carried out observations with the Spectrograph for INtegral Field Observation in the Near Infrared (SINFONI) at the Very Large Telescope (VLT) on Cerro Paranal in Chile. I reduced and analyzed the recorded data, which contain spatial and spectral information in the H-band (1.45–1.85 µm) and K-band (1.95–2.45 µm) on the central 10″ × 10″ of the observed galaxies. Additionally, Atacama Large Millimeter/submillimeter Array (ALMA) data at 350 GHz (~0.87 mm) as well as high-resolution optical Hubble Space Telescope (HST) images are used in the analysis. For NGC 1433 I deduce, from a comparison of the distributions of gas, dust, and the intensity of highly ionized emission lines, that the galaxy center lies ~70 pc north-northwest of the prior estimate. A velocity gradient is observed at the new center, which I interpret as a bipolar outflow, a circumnuclear disk, or a combination of both. At least one dust-and-gas arm leads from a ring at r ~ 200 pc towards the nucleus and might feed the SMBH. Two bright spots of warm H₂ gas are detected that indicate hidden star formation or a spiral arm-arm interaction. From the stellar velocity dispersion (SVD) I estimate a SMBH mass of ~1.74 × 10^7 M☉. For NGC 1566 I observe a nuclear gas disk of ~150 pc in radius with a spiral structure, whose total mass I estimate to be ~5.4 × 10^7 M☉. Which mechanisms excite the gas in the disk is not clear; neither can the existence of outflows be proven, nor is star formation detected over the whole disk. On one side of the spiral structure I detect a star-forming region with an estimated star formation rate of ~2.6 × 10^-3 M☉ yr^-1. From broad Brγ emission and the SVD I estimate a mean SMBH mass of ~5.3 × 10^6 M☉ with an Eddington ratio of ~2 × 10^-3. Part II deals with the final tests, which I conducted, of the Fringe and Flexure Tracker (FFTS) for the LBT INterferometric Camera and NIR/Visible Adaptive iNterferometer for Astronomy (LINC-NIRVANA) at the Large Binocular Telescope (LBT) in Arizona, USA. The FFTS is the subsystem that combines the two separate beams of the LBT and enables near-infrared interferometry over a significantly large field of view.
The FFTS has a cryogenic system and an ambient-temperature system, separated by a baffle system. I redesigned this baffle to guarantee the functionality of the system after the final tests in the Cologne cryostat; the redesign did not affect any scientific performance of LINC-NIRVANA. In the final cooldown tests I show that the baffle fulfills the temperature requirement, staying < 110 K while the moving stages in the ambient system stay > 273 K, which was not achieved with the old baffle design. Additionally, I test the tilting flexure of the whole FFTS and show that accurate positioning of the detector, and tracking during observation, can be guaranteed.
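For orientation only: SMBH masses from stellar velocity dispersion, as in Part I, are usually obtained through an M–σ calibration, and the Eddington ratio compares the bolometric luminosity with the Eddington limit. One widely used fit is quoted below (Gültekin et al. 2009); the thesis's exact calibration may differ.

```latex
\log_{10}\!\left(\frac{M_{\mathrm{BH}}}{M_\odot}\right)
  \approx 8.12 + 4.24\,\log_{10}\!\left(\frac{\sigma_*}{200\ \mathrm{km\,s^{-1}}}\right),
\qquad
\lambda_{\mathrm{Edd}} = \frac{L_{\mathrm{bol}}}{L_{\mathrm{Edd}}},
\quad
L_{\mathrm{Edd}} \approx 1.26\times10^{38}
  \left(\frac{M_{\mathrm{BH}}}{M_\odot}\right)\ \mathrm{erg\,s^{-1}}.
```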
Abstract:
Performing macroscopy in pathology involves planning and implementing methods for the selection, description and collection of biological material from human organs and tissues, actively contributing to clinical pathology analysis by preparing the macroscopic report and by collecting and identifying fragments, according to standardized protocols and internationally recognized criteria for determining prognosis. The Macroscopy in Pathology course is a full-year programme with theoretical and practical components taught by pathologists. It is divided into weekly modules by organ/system surgical pathology and includes a practical, hands-on component in pathology departments. The students are 50 biomedical scientists, aged from 22 to 50 and drawn from all across the country, who want to acquire competences in macroscopy. A blended learning strategy was used in order to give students the opportunity to attend at a distance, to support the contents, lessons and interaction with colleagues and teachers, and to facilitate formative/summative assessment.