876 results for Requirements elicitation
Abstract:
Aims: To determine the self-assessed continuing professional development (CPD) needs of dental practitioners and identify how each discipline can best be served by a dental CPD programme. To set findings in the context of the available literature and contribute to the development of CPD programmes. Method: Topics were arranged into eight disciplines: practice management; paediatric dentistry; preventive dentistry; orthodontics; behaviour management; dentistry for people with a disability; oral medicine and surgery; and, restorative dentistry. A web-based questionnaire was constructed and administered using a MarkClass 2.21 online survey tool. Results: Fifty-six self-reported assessment responses were received, with three-quarters of participants having graduated within the past 10 years. Topics in oral medicine and surgery attracted consistently high levels of interest. A tendency to favour topics with a perceived direct clinical application was observed. Topics recommended by the Dental Council as core areas for CPD were given a high level of priority by respondents. Conclusions: Traditional lectures remain a valued mode of CPD participation. Practical courses were valued across all dental topics offered. A varied approach to determining the requirements of dentists is essential to appropriately support the practitioner.
Abstract:
I explore and analyze the problem of finding socially optimal capital requirements for financial institutions, considering two distinct channels of contagion: direct exposures among the institutions, as represented by a network, and fire-sales externalities, which reflect the negative price impact of massive liquidation of assets. These two channels amplify shocks from individual financial institutions to the financial system as a whole and thus increase the risk of joint defaults among interconnected financial institutions; this is often referred to as systemic risk. In the model, there is a trade-off between reducing systemic risk and raising the capital requirements of the financial institutions. The policymaker weighs this trade-off and determines the optimal capital requirements for individual financial institutions. I provide a method for finding and analyzing the optimal capital requirements that can be applied to arbitrary network structures and arbitrary distributions of investment returns.
In particular, I first consider a network model consisting only of direct exposures and show that the optimal capital requirements can be found by solving a stochastic linear programming problem. I then extend the analysis to financial networks with default costs and show that the optimal capital requirements can be found by solving a stochastic mixed-integer programming problem. The computational complexity of this problem poses a challenge, and I develop an iterative algorithm that can be executed efficiently. I show that the iterative algorithm leads to solutions that are nearly optimal by comparing it with lower bounds based on a dual approach. I also show that the iterative algorithm converges to the optimal solution.
Finally, I incorporate fire sales externalities into the model. In particular, I am able to extend the analysis of systemic risk and the optimal capital requirements with a single illiquid asset to a model with multiple illiquid assets. The model with multiple illiquid assets incorporates liquidation rules used by the banks. I provide an optimization formulation whose solution provides the equilibrium payments for a given liquidation rule.
I further show that the socially optimal capital problem can be formulated as a convex problem under the "socially optimal liquidation" rule and as a convex mixed-integer problem under the prioritized liquidation rule. Finally, I illustrate the results of the methodology on numerical examples and discuss some implications for capital regulation policy and stress testing.
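The direct-exposure channel described in this abstract builds on clearing-payment equilibria in the style of Eisenberg and Noe. As a minimal illustration only (this is not the author's formulation, and all function and variable names are hypothetical), the equilibrium payments for a given network of interbank liabilities can be computed by fixed-point iteration:

```python
def clearing_vector(liabilities, external_assets, tol=1e-10, max_iter=1000):
    """Eisenberg-Noe style clearing-vector fixed point.

    liabilities[i][j]: nominal amount bank i owes bank j.
    external_assets[i]: bank i's assets outside the interbank network.
    Returns the total payment made by each bank at the clearing equilibrium.
    """
    n = len(external_assets)
    p_bar = [sum(row) for row in liabilities]  # total nominal obligations
    # relative liability matrix: share of bank i's payments that go to bank j
    pi = [[liabilities[i][j] / p_bar[i] if p_bar[i] > 0 else 0.0
           for j in range(n)] for i in range(n)]
    p = p_bar[:]
    for _ in range(max_iter):
        # each bank pays the lesser of its obligations and its total resources
        p_new = [min(p_bar[i],
                     external_assets[i] + sum(pi[j][i] * p[j] for j in range(n)))
                 for i in range(n)]
        if max(abs(a - b) for a, b in zip(p_new, p)) < tol:
            return p_new
        p = p_new
    return p
```

Starting from full nominal payments and iterating downward converges to the greatest clearing vector; the optimization formulations mentioned in the abstract characterize equilibria of this kind within the larger capital-requirement problem.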
Abstract:
The sense and avoid capability is one of the greatest challenges that must be addressed to safely integrate unmanned aircraft systems into civil and nonsegregated airspace. This paper reviews existing regulations, recommended practices, and standards in sense and avoid for unmanned aircraft systems. Gaps and issues are identified, as are the different factors that are likely to affect actual sense and avoid requirements. It is found that the operational environment (flight altitude, meteorological conditions, and class of airspace) plays an important role in determining the type of flying hazards that the unmanned aircraft system might encounter. In addition, the automation level and the data-link architecture of the unmanned aircraft system are key factors that will largely determine the sense and avoid system requirements. Tactical unmanned aircraft, performing missions similar to general aviation, are found to be the most challenging systems from a sense and avoid point of view, and further research and development efforts are still needed before their seamless integration into nonsegregated airspace.
Abstract:
Body size is a key determinant of metabolic rate, but logistical constraints have led to a paucity of energetics measurements from large water-breathing animals. As a result, estimating energy requirements of large fish generally relies on extrapolation of metabolic rate from individuals of lower body mass using allometric relationships that are notoriously variable. Swim-tunnel respirometry is the ‘gold standard’ for measuring active metabolic rates in water-breathing animals, yet previous data are entirely derived from body masses <10 kg – at least one order of magnitude lower than the body masses of many top-order marine predators. Here, we describe the design and testing of a new method for measuring metabolic rates of large water-breathing animals: a c. 26 000 L seagoing ‘mega-flume’ swim-tunnel respirometer. We measured the swimming metabolic rate of a 2·1-m, 36-kg zebra shark Stegostoma fasciatum within this new mega-flume and compared the results to data we collected from other S. fasciatum (3·8–47·7 kg body mass) swimming in static respirometers and previously published measurements of active metabolic rate measurements from other shark species. The mega-flume performed well during initial tests, with intra- and interspecific comparisons suggesting accurate metabolic rate measurements can be obtained with this new tool. Inclusion of our data showed that the scaling exponent of active metabolic rate with mass for sharks ranging from 0·13 to 47·7 kg was 0·79; a similar value to previous estimates for resting metabolic rates in smaller fishes. We describe the operation and usefulness of this new method in the context of our current uncertainties surrounding energy requirements of large water-breathing animals. We also highlight the sensitivity of mass-extrapolated energetic estimates in large aquatic animals and discuss the consequences for predicting ecosystem impacts such as trophic cascades.
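The scaling exponent reported above is conventionally the slope of an ordinary least-squares fit of log metabolic rate against log body mass, from the allometric relationship MR = a·M^b. The following is a minimal sketch of that standard fit (hypothetical names, not the authors' actual analysis):

```python
import math

def scaling_exponent(masses_kg, metabolic_rates):
    """Estimate the allometric exponent b (and coefficient a) in
    MR = a * M**b by ordinary least squares on log-transformed data."""
    xs = [math.log(m) for m in masses_kg]
    ys = [math.log(r) for r in metabolic_rates]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope of the log-log regression is the scaling exponent b
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    log_a = mean_y - b * mean_x
    return b, math.exp(log_a)
```

The abstract's caution about mass extrapolation stems from this fit: a small error in the estimated slope b compounds multiplicatively when projected an order of magnitude beyond the measured mass range.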
Abstract:
This work studies the creation, development and evolution of the Spanish midshipmen, through their entry requirements, from the early 18th century to the second third of the 19th century. The aims are: to examine in depth an essential post in the Navy, the naval cadet, which still requires an exhaustive review; and, primarily, to link the transformations in the candidates' requirements to the social debates taking place in Spain across these two centuries. The so-called transition from the society of estates to the class society, or the step from a society based on inherited criteria to one based on achievement, did not occur linearly; changes and inertia went hand in hand with contradictions and conflicts. In this paper, therefore, the Spanish midshipmen are analysed through several interrelated variables.
Abstract:
Requirements Engineering (RE) has received much attention in research and practice due to its importance to software project success. Its interdisciplinary nature, its dependency on the customer, and its inherent uncertainty still render the discipline difficult to investigate. This results in a lack of empirical data, which is necessary, however, to demonstrate which practically relevant RE problems exist and to what extent they matter. Motivated by this situation, we initiated the Naming the Pain in Requirements Engineering (NaPiRE) initiative, which constitutes a globally distributed, bi-yearly replicated family of surveys on the status quo and problems in practical RE.
In this article, we report on the analysis of data obtained from 228 companies in 10 countries. We apply Grounded Theory to the data obtained from NaPiRE and reveal which contemporary problems practitioners encounter. To this end, we analyse 21 problems derived from the literature with respect to their relevance and criticality in dependence on their context, and we complement this picture with a cause-effect analysis showing the causes and effects surrounding the most critical problems.
Our results give us a better understanding of which problems exist and how they manifest themselves in practical environments. Thus, we provide a first step towards grounding contributions to RE in empirical observations, a field which until now has been dominated by conventional wisdom only.
Abstract:
Face-to-face interviews are a fundamental research tool in qualitative research. While this form of data collection can provide many valuable insights, it can often fall short of providing a complete picture of a research subject's experiences. Point-of-view (PoV) interviewing is an elicitation technique used in the social sciences as a means of enriching data obtained from research interviews. Recording research subjects' first-person perspectives, for example by wearing digital video glasses, can afford deeper insights into their experiences. PoV interviewing can help make visible what cannot be verbalized, and it does not rely on memory as much as the traditional interview does. The use of such relatively inexpensive technology is gaining interest in health-profession educational research and pedagogy, such as dynamic simulation-based learning and research activities. In this interview, Dr Gerry Gormley (a medical education researcher) talks to Dr Jonathan Skinner (an anthropologist with an interest in PoV interviewing), exploring some of the many crossover implications of PoV interviewing for medical education research and practice.
Abstract:
The paper addresses the technological change currently happening in industry. First, the global trends that impact industrial development are reviewed, followed by a summary of expanding intelligent technologies and their systems. The paper then describes in detail the concept of Industry 4.0 and its major technology-related aspects. At the end of the paper, the social consequences are summarized, especially generational concerns connected to the current change in industrial technology. The purpose of the study is to raise some special aspects of, and considerations on, the given subject.
Abstract:
Automated acceptance testing is higher-level testing of software that checks whether the system abides by the requirements desired by the business clients, using scripts separate from the software itself. This project is a study of the feasibility of acceptance tests written on Behavior Driven Development principles. The project includes an implementation part in which automated acceptance testing was written for the Touch-point web application developed by Dewire (a software consultant company) for Telia (a telecom company), from the requirements received from the customer (Telia). The automated acceptance testing uses the Cucumber-Selenium framework, which enforces Behavior Driven Development principles. The purpose of the implementation is to verify the practicability of this style of acceptance testing. From the completed implementation, it was concluded that all real-world customer requirements can be converted into executable specifications, and that the process was not at all time-consuming or difficult for a low-experienced programmer such as the author. The project also includes a survey to measure the learnability and understandability of Gherkin, the language that Cucumber understands. The survey consists of Gherkin examples followed by questions that involve making changes to those examples. The survey had three parts: the first easy, the second medium and the third most difficult. It also used a linear scale from 1 (very easy) to 5 (very difficult) to rate the difficulty of each part, and the time at which participants began was recorded in order to calculate the total time taken to learn the material and answer the questions. The survey was taken by 18 Dewire employees whose primary working role was programmer, tester or project manager; in the results, testers and project managers were grouped as non-programmers.
The survey concluded that Gherkin is very easy and quick to learn, with participants rating it accordingly.
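For readers unfamiliar with Gherkin, a short hypothetical feature of the kind presented in the survey (the scenario and steps here are illustrative, not drawn from the Touch-point application) might read:

```gherkin
Feature: Subscription overview
  As a Touch-point user
  I want to see my active subscriptions
  So that I can manage my services

  Scenario: Viewing active subscriptions after login
    Given I am logged in as "test.user@example.com"
    When I open the subscriptions page
    Then I should see a list of my active subscriptions
```

Each Given/When/Then step is bound to an executable step definition (here, Selenium browser actions), which is what makes such specifications runnable as acceptance tests.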
Abstract:
Providing good customer service, inexpensively, is a problem commonly faced by managers of service operations. To tackle this problem, managers must do four tasks: forecast customer demand for the service; translate these forecasts into employee requirements; develop a labor schedule that provides appropriate numbers of employees at appropriate times; and control the delivery of the service in real-time. This paper focuses upon the translation of forecasts of customer demand into employee requirements. Specifically, it presents and evaluates two methods for determining desired staffing levels. One of these methods is a traditional approach to the task, while the other, by using modified customer arrival rates, offers a better means of accounting for the multi-period impact of customer service. To calculate the modified arrival rates, the latter method reduces (increases) the actual customer arrival rate for a period to account for customers who arrived in the period (in earlier periods) but have some of their service performed in subsequent periods (in the period). In an experiment simulating 13824 service delivery environments, the new method demonstrated its superiority by serving 2.74% more customers within the specified waiting time limit while using 7.57% fewer labor hours.
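As a simplified sketch of the modified-rate idea (assuming, purely for illustration, that service begins at arrival and ignoring queueing delay, which the paper's actual method accounts for; all names are hypothetical), each customer's service time can be apportioned across the periods in which the work is actually performed:

```python
def modified_loads(arrivals, service_times, period_length, n_periods):
    """Apportion each customer's service across the periods it spans.

    arrivals[i]: arrival time of customer i (same units as period_length).
    service_times[i]: service duration of customer i.
    Returns per-period workload, crediting spill-over service to the
    later periods in which it is performed; work past the planning
    horizon is truncated.
    """
    load = [0.0] * n_periods
    for start, duration in zip(arrivals, service_times):
        t, remaining = start, duration
        while remaining > 1e-12 and t < n_periods * period_length:
            period = int(t // period_length)
            room = (period + 1) * period_length - t  # time left in this period
            chunk = min(remaining, room)
            load[period] += chunk
            remaining -= chunk
            t += chunk
    return load
```

The resulting per-period loads, rather than raw arrival counts, would then feed the staffing calculation, which is the adjustment the paper credits for its improved service levels.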
Abstract:
The use of secondary data in health care research has become a very important issue over the past few years. Data from the treatment context are being used for evaluation of medical data for external quality assurance, as well as to answer medical questions in the form of registers and research databases. Additionally, the establishment of electronic clinical systems like data warehouses provides new opportunities for the secondary use of clinical data. Because health data is among the most sensitive information about an individual, the data must be safeguarded from disclosure.
Abstract:
This study corresponds to the first phase of formative research recommended in the Theory of Planned Behaviour for developing an intervention. Our objectives are to identify the modal beliefs about following an exercise regimen in people with fibromyalgia, to test the items for the direct assessment of the predictive constructs, and to explore their relationships with behaviour. We assessed 46 women with fibromyalgia. Content analysis showed a greater number of positive than negative consequences associated with following the exercise regimen (behavioural beliefs); family and friends are the important referents (normative beliefs); and facilitating and inhibiting factors for performing the exercise behaviour were detected, related to aspects of fibromyalgia (control beliefs) such as pain, fatigue and mood. The lowest internal-consistency index was that of the subjective-norm scale (α = .78). The results confirm the sedentary lifestyle of the sample (previous behaviour: mean = 3.67; range = 1-7), although they also suggest that these individuals intend to perform the behaviour (mean = 5.67). The relationships obtained among the constructs are those expected from the theory, supporting the appropriateness of applying it to the selected behaviour and population.