791 results for Concerns Based Adoption Model CBAM
Abstract:
This article reports on a conversation between 12 lesbian, gay, bisexual and transgender (LGBT) psychologists at the first international LGBT Psychology Summer Institute at the University of Michigan in August 2009. Participants discuss how their work in LGBT psychology is affected by national policy, funding and academic contexts and the transnational influence of the US-based stigma model of LGBT psychology. The challenges and possibilities posed by internationalism are discussed with reference to the dominance of the United States, the cultural limits of terms such as 'lesbian, gay, bisexual and transgender', intergenerational communication between researchers and the role of events such as the Summer Institute in creating an international community of LGBT psychologists. © 2010 Taylor & Francis.
Abstract:
The paper presents a case study of geo-monitoring a region, consisting of capturing human expertise and encoding it into a knowledge-based system. Once the maps have been processed, data patterns are detected using knowledge-based agents for harvest prognosis.
Abstract:
The Finance research team carried out a wide range of analyses within project TÁMOP-4.2.1.B-09/1/KMR-2010-0005. We showed that increased leverage among economic actors at all levels clearly raises systemic risk, since the probability of each actor's default grows. If leverage limits are introduced to different degrees and at different times across sectors and countries, the actors subject to later restrictions gain a clear competitive advantage. Examining capital allocation within financial institutions, we demonstrated that the capital (risk) covering operations can always be divided among divisions so that no party has an interest in cancelling the agreement. This division cannot, however, be made fair in every respect, so some business lines may suffer a competitive disadvantage if competitors burden the same activity less unfairly. We showed that the regulation of private pension funds strongly affects the investment performance of the funds, and that these rules influence the long-term competitiveness of society as a whole. We also found that, before the economic crisis, Hungarian banks were unable to assess their clients' risk-bearing capacity correctly; moreover, their commission-based income models gave them no incentive to do so. Several studies addressed the competitiveness of Hungarian firms, analysing the effects of various taxes, exchange-rate risks and financing policies on competitiveness. A separate project examined how interest-rate volatility and the presence of asset collateral behind loans affect firm value. We highlighted the growing risk of non-payment and reviewed both the management strategies potentially available and those actually used. We also investigated how the shareholders of listed companies exploit tax-optimisation opportunities linked to dividend payments; market evidence suggests that investors execute such tax-avoiding trades for a significant share of stocks. A further study examined the role played by intellectual capital at Hungarian companies, finding that firms handled the issue with far greater proficiency in 2009 than five years earlier. Finally, we showed that ownership structure can substantially influence how firms construct their goals and how they view their intangible assets.
Abstract:
The National Council Licensure Examination for Registered Nurses (NCLEX-RN) is the examination that all graduates of nursing education programs must pass to attain the title of registered nurse. Currently the NCLEX-RN passing rate is at an all-time low (81%) for first-time test takers (NCSBN, 2004), amidst a nationwide shortage of registered nurses (Glabman, 2001). Because of the critical need to supply greater numbers of professional nurses, and the potential accreditation ramifications that low NCLEX-RN passing rates can have on schools of nursing and graduates, this study tests the effectiveness of a predictor model. The model is based upon the theoretical framework of McClusky's (1959) theory of margin (ToM), with the hope that students found to be at risk of NCLEX-RN failure can be identified and remediated before taking the actual licensure examination. To date, no theory-based predictor model has been identified that predicts success on the NCLEX-RN.

The model was tested using prerequisite course grades, nursing course grades and scores on standardized examinations for the 2003 associate degree nursing graduates at an urban community college (N = 235). Success was determined through the reporting of a pass on the NCLEX-RN examination by the Florida Board of Nursing. Point-biserial correlations tested model assumptions regarding variable relationships, while logistic regression was used to test the model's predictive power.

Correlations among variables were significant, and the model accounted for 66% of the variance in graduates' success on the NCLEX-RN with 98% prediction accuracy. Although certain prerequisite course grades and nursing course grades were found to be significant to NCLEX-RN success, the overall model was most predictive at the conclusion of the academic program of study. The inclusion of the RN Assessment Examination, taken during the final semester of course work, was the most significant predictor of NCLEX-RN success. Success on the NCLEX-RN allows graduates to work as registered nurses, reflects positively on a school's academic performance record, and supports the appropriateness of the educational program's goals and objectives. The study's findings support other potential uses of McClusky's theory of margin as a predictor of program outcomes in other venues of adult education.
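The model testing described above rests on the point-biserial correlation, i.e. the Pearson correlation between a dichotomous outcome (pass/fail) and a continuous predictor (a grade or exam score). A minimal sketch of that statistic, with invented pass/fail and score data (the function name and numbers are illustrative, not taken from the study):

```python
import math

def point_biserial(binary, scores):
    """Point-biserial correlation between a 0/1 outcome (e.g. NCLEX-RN
    pass/fail) and a continuous predictor (e.g. a course grade).
    Assumes both groups are non-empty."""
    n = len(binary)
    group1 = [s for b, s in zip(binary, scores) if b == 1]
    group0 = [s for b, s in zip(binary, scores) if b == 0]
    m1 = sum(group1) / len(group1)
    m0 = sum(group0) / len(group0)
    mean = sum(scores) / n
    # population standard deviation of all scores
    sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / n)
    p = len(group1) / n
    return (m1 - m0) / sd * math.sqrt(p * (1 - p))

# hypothetical data: 1 = passed the NCLEX-RN, paired with exam scores
passed = [1, 1, 1, 0, 0, 1, 0, 1]
score = [88, 92, 85, 70, 65, 90, 72, 80]
r = point_biserial(passed, score)
```

The same value would come out of an ordinary Pearson correlation with pass/fail coded as 0/1; the closed form above just makes the group-mean contrast explicit.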
Abstract:
In recent years, wireless communication infrastructures have been widely deployed for both personal and business applications. IEEE 802.11 series Wireless Local Area Network (WLAN) standards attract a lot of attention due to their low cost and high data rates. Wireless ad hoc networks that use IEEE 802.11 standards are one of the hot spots of recent network research, and designing appropriate Media Access Control (MAC) layer protocols is one of the key issues for such networks.

Existing wireless applications typically use omni-directional antennas, whose gain is the same in all directions. Due to the nature of the Distributed Coordination Function (DCF) mechanism of the IEEE 802.11 standards, only one of the one-hop neighbors can send data at a time; nodes other than the sender and the receiver must be in the idle or listening state, otherwise collisions can occur. The downside of omni-directionality is that the spatial reuse ratio is low and the capacity of the network is considerably limited.

Directional antennas have therefore been introduced to improve spatial reuse. A directional antenna offers the following benefits: it can improve transport capacity by decreasing the interference of a directional main lobe; it can increase coverage range due to a higher SINR (Signal to Interference plus Noise Ratio), i.e., with the same power consumption, better connectivity can be achieved; and power usage can be reduced, i.e., for the same coverage, a transmitter can lower its power consumption.

To exploit the advantages of directional antennas, we propose a relay-enabled MAC protocol. Two relay nodes are chosen to forward data when the channel condition of the direct link from the sender to the receiver is poor. The two relay nodes can transfer data at the same time, and pipelined data transmission can be achieved by using directional antennas. Throughput improves significantly when the relay-enabled MAC protocol is introduced.

Besides these strong points, directional antennas also have some explicit drawbacks, such as the hidden terminal and deafness problems and the requirement of maintaining location information for each node. Therefore, an omni-directional antenna should be used in some situations. The combined use of omni-directional and directional antennas leads to the problem of configuring heterogeneous antennas, i.e., given a network topology and a traffic pattern, we need to find a tradeoff between using omni-directional and directional antennas to obtain better network performance.

Directly and mathematically establishing the relationship between network performance and antenna configurations is extremely difficult, if not intractable. In this research, we therefore propose several clustering-based methods to obtain approximate solutions to the heterogeneous antenna configuration problem, which can improve network performance significantly. Our proposed methods consist of two steps. The first step (clustering links) clusters the links into groups based on a matrix-based system model; after clustering, the links in the same group have similar neighborhood nodes and will use the same type of antenna. The second step (labeling links) decides the type of antenna for each group: some groups of links will use directional antennas and others will adopt omni-directional antennas. Experiments comparing the proposed methods with existing methods demonstrate that our clustering-based methods improve network performance significantly.
Abstract:
This dissertation aims to improve the performance of existing assignment-based dynamic origin-destination (O-D) matrix estimation models in order to successfully apply Intelligent Transportation Systems (ITS) strategies for traffic congestion relief and dynamic traffic assignment (DTA) in transportation network modeling. The methodology framework has two advantages over existing assignment-based dynamic O-D matrix estimation models. First, it incorporates an initial O-D estimation model into the estimation process to provide a high-confidence initial input for the dynamic O-D estimation model, which has the potential to improve the final estimation results and reduce the associated computation time. Second, the proposed framework can automatically convert traffic volume deviation to traffic density deviation in the objective function under congested traffic conditions. Traffic density is a better indicator of traffic demand than traffic volume under congested conditions, so the conversion can improve estimation performance. The proposed method outperforms a typical assignment-based estimation model (Zhou et al., 2003) in several case studies. In the case study for I-95 in Miami-Dade County, Florida, the proposed method produces a good result in seven iterations, with a root mean square percentage error (RMSPE) of 0.010 for traffic volume and an RMSPE of 0.283 for speed; in contrast, Zhou's model requires 50 iterations to obtain an RMSPE of 0.023 for volume and an RMSPE of 0.285 for speed. In the case study for Jacksonville, Florida, the proposed method reaches a convergent solution in 16 iterations with an RMSPE of 0.045 for volume and an RMSPE of 0.110 for speed, while Zhou's model needs 10 iterations to obtain its best solution, with an RMSPE of 0.168 for volume and an RMSPE of 0.179 for speed.
The successful application of the proposed framework to real road networks demonstrates its ability to provide results with satisfactory accuracy within a reasonable time, establishing its potential usefulness in supporting dynamic traffic assignment modeling, ITS applications, and other strategies.
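The RMSPE figures quoted above follow the usual definition: the square root of the mean squared relative deviation between observed and estimated values. A minimal sketch with made-up link counts (observations with a zero value are skipped to avoid division by zero; the numbers are illustrative only):

```python
import math

def rmspe(observed, estimated):
    """Root mean square percentage error between observed link volumes
    (or speeds) and the values reproduced by an estimated O-D matrix."""
    terms = [((e - o) / o) ** 2
             for o, e in zip(observed, estimated) if o != 0]
    return math.sqrt(sum(terms) / len(terms))

# hypothetical observed vs. estimated link volumes (vehicles/hour)
obs = [1200.0, 950.0, 1430.0, 800.0]
est = [1180.0, 970.0, 1400.0, 812.0]
err = rmspe(obs, est)
```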
Abstract:
An iterative travel time forecasting scheme, named the Advanced Multilane Prediction based Real-time Fastest Path (AMPRFP) algorithm, is presented in this dissertation. The scheme is derived from a conventional kernel-estimator-based prediction model by associating the real-time nonlinear impacts caused by neighboring arcs' traffic patterns with historical traffic behaviors. The AMPRFP algorithm is evaluated by predicting the travel time of congested arcs in the urban area of Jacksonville. Experimental results illustrate that the proposed scheme significantly reduces both the relative mean error (RME) and the root-mean-squared error (RMSE) of the predicted travel time. To obtain the high-quality real-time traffic information that is essential to the performance of the AMPRFP algorithm, a data-cleaning scheme enhanced empirical learning (DCSEEL) algorithm is also introduced. This method investigates the correlation between distance and direction in the geometrical map, which is not considered in existing fingerprint localization methods. Specifically, empirical learning methods are applied to minimize the error in the estimated distance, and a direction filter is developed to remove joints that have a negative influence on localization accuracy. Synthetic experiments in urban, suburban and rural environments evaluate the performance of the DCSEEL algorithm in determining a cellular probe's position; the results show that the probe's localization accuracy can be notably improved. Additionally, a new fast correlation technique is developed to overcome the time-efficiency problem of the existing correlation-algorithm-based floating car data (FCD) technique.
The matching process is transformed into a one-dimensional (1-D) curve matching problem, and the Fast Normalized Cross-Correlation (FNCC) algorithm is introduced to supersede the Pearson Product-Moment Correlation Coefficient (PMCC) algorithm in order to meet the real-time requirement of the FCD method. The fast correlation technique shows a significant improvement in computational cost without affecting the accuracy of the matching process.
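The core of the matching step is a Pearson correlation computed at every shift of a template along a signal. A minimal sketch of that 1-D normalized cross-correlation, using prefix sums so each window's mean and variance cost O(1) to obtain (the running-sum idea behind FNCC; the function name and data are illustrative, not the dissertation's implementation):

```python
import math

def ncc_profile(signal, template):
    """Pearson r of `template` against every window of `signal`. Prefix
    sums give each window's sum and sum of squares in O(1), avoiding a
    full PMCC recomputation per shift."""
    n, m = len(signal), len(template)
    tmean = sum(template) / m
    tdev = [t - tmean for t in template]          # zero-mean template
    tnorm = math.sqrt(sum(d * d for d in tdev))
    # prefix sums of the signal and of its squares
    ps, ps2 = [0.0], [0.0]
    for x in signal:
        ps.append(ps[-1] + x)
        ps2.append(ps2[-1] + x * x)
    out = []
    for i in range(n - m + 1):
        s = ps[i + m] - ps[i]                     # window sum
        s2 = ps2[i + m] - ps2[i]                  # window sum of squares
        wnorm = math.sqrt(max(s2 - s * s / m, 0.0))
        # since sum(tdev) == 0, this equals the covariance numerator
        num = sum(signal[i + j] * tdev[j] for j in range(m))
        out.append(num / (wnorm * tnorm) if wnorm * tnorm > 0 else 0.0)
    return out

# hypothetical speed profile matched against a road-segment template
sig = [3.0, 4.0, 9.0, 12.0, 9.0, 4.0, 3.0, 2.0]
tpl = [9.0, 12.0, 9.0]
profile = ncc_profile(sig, tpl)
best = max(range(len(profile)), key=profile.__getitem__)
```

At the best shift the window matches the template exactly, so the correlation there is 1.0.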
Abstract:
In the past two decades, multi-agent systems (MAS) have emerged as a new paradigm for conceptualizing large and complex distributed software systems. A multi-agent system view provides a natural abstraction for both the structure and the behavior of modern-day software systems. Although many conceptual frameworks exist for using multi-agent systems, there has been no well-established and widely accepted method for modeling them. This dissertation research addressed the representation and analysis of multi-agent systems based on model-oriented formal methods. The objective was to provide a systematic approach for studying MAS at an early stage of system development to ensure the quality of design.

Given that no well-defined formal model directly supported agent-oriented modeling, this study centered on three main topics: (1) adapting a well-known formal model, predicate transition nets (PrT nets), to support MAS modeling; (2) formulating a modeling methodology to ease the construction of formal MAS models; and (3) developing a technique to support machine analysis of formal MAS models using model checking technology. PrT nets were extended with the notions of dynamic structure, agent communication and coordination to support agent-oriented modeling. An aspect-oriented technique was developed to address the modularity of agent models and the compositionality of incremental analysis. A set of translation rules was defined to systematically translate formal MAS models into concrete models that can be verified with the model checker SPIN (Simple Promela Interpreter).

This dissertation presents the framework developed for modeling and analyzing MAS, including a well-defined process model based on nested PrT nets and a comprehensive methodology to guide the construction and analysis of formal MAS models.
Abstract:
The major objectives of this dissertation were to develop optimal spatial techniques to model the spatial-temporal changes of the lake sediments and their nutrients from 1988 to 2006, and to evaluate the impacts of the hurricanes that occurred during 1998–2006. Mud areas, volumes and weights were calculated using validated kriging models. The mud zone shrank by about 10.5% from 1988 to 1998 and expanded by about 6.2% from 1998 to 2006. From 1988 to 1998, mud thicknesses increased by up to 26 cm in the central lake area, while the mud area and volume decreased by about 13.78% and 10.26%, respectively. From 1998 to 2006, mud depths declined by up to 41 cm in the central lake area and mud volume fell by about 27%. Mud weight increased by up to 29.32% from 1988 to 1998 but fell by over 20% from 1998 to 2006. The reduction of mud sediments is likely due to re-suspension and redistribution by waves and currents produced by large storm events, particularly Hurricanes Frances and Jeanne in 2004 and Wilma in 2005. Regression, kriging, geographically weighted regression (GWR) and regression-kriging models were calibrated and validated for the spatial analysis of the lake's sediment TP and TN. GWR models provide the most accurate predictions for TP and TN based on model performance and error analysis. TP values declined from an average of 651 to 593 mg/kg from 1998 to 2006, especially in the lake's western and southern regions. From 1988 to 1998, TP declined in the northern and southern areas and increased in the central-western part of the lake. TP weights increased by about 37.99%–43.68% from 1988 to 1998 and decreased by about 29.72%–34.42% from 1998 to 2006. From 1988 to 1998, TN decreased in most areas, especially in the northern and southern lake regions; the western littoral zone had the biggest increase, up to 40,000 mg/kg. From 1998 to 2006, TN declined from an average of 9,363 to 8,926 mg/kg, especially in the central and southern regions, while the biggest increases occurred in the northern lake and southern edge areas. TN weights increased by about 15%–16.2% from 1988 to 1998 and decreased by about 7%–11% from 1998 to 2006.
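GWR fits a separate regression at each location, down-weighting distant observations with a spatial kernel, which is what lets it outperform a single global regression on spatially varying data like these. A minimal sketch of one such local fit with a Gaussian kernel and invented core data (coordinates, mud depths, TP values and the bandwidth are all made up; real calibrations also tune the bandwidth):

```python
import math

def gwr_local(points, target, bandwidth):
    """One GWR step: a distance-weighted simple linear regression fitted
    at `target`, with Gaussian kernel weights w_i = exp(-(d_i / h)^2).
    `points` is a list of (x_coord, y_coord, predictor, response)."""
    w, xs, ys = [], [], []
    for px, py, x, y in points:
        d = math.hypot(px - target[0], py - target[1])
        w.append(math.exp(-(d / bandwidth) ** 2))
        xs.append(x)
        ys.append(y)
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, xs)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    sxy = sum(wi * (xi - xbar) * (yi - ybar)
              for wi, xi, yi in zip(w, xs, ys))
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, xs))
    slope = sxy / sxx
    return ybar - slope * xbar, slope  # local intercept, local slope

# hypothetical cores: (easting_km, northing_km, mud depth cm, TP mg/kg)
cores = [(0, 0, 10.0, 500.0), (1, 0, 20.0, 650.0),
         (0, 1, 15.0, 570.0), (5, 5, 25.0, 900.0)]
b0, b1 = gwr_local(cores, target=(0.5, 0.5), bandwidth=2.0)
```

Repeating this fit at every prediction location yields a surface of locally varying coefficients instead of one global slope.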
Abstract:
Shallow marine ecosystems are experiencing significant environmental alterations as a result of changing climate and increasing human activities along coasts. Intensive urbanization of the southeast Florida coast and intensification of climate change over the last few centuries changed the character of coastal ecosystems in the semi-enclosed Biscayne Bay, Florida. In order to develop management policies for the Bay, it is vital to obtain reliable scientific evidence of past ecological conditions. The long-term records of subfossil diatoms obtained from No Name Bank and Featherbed Bank in the Central Biscayne Bay, and from the Card Sound Bank in the neighboring Card Sound, were used to study the magnitude of the environmental change caused by climate variability and water management over the last ~ 600 yr. Analyses of these records revealed that the major shifts in the diatom assemblage structures at No Name Bank occurred in 1956, at Featherbed Bank in 1966, and at Card Sound Bank in 1957. Smaller magnitude shifts were also recorded at Featherbed Bank in 1893, 1942, 1974 and 1983. Most of these changes coincided with severe drought periods that developed during the cold phases of El Niño Southern Oscillation (ENSO), Atlantic Multidecadal Oscillation (AMO) and Pacific Decadal Oscillation (PDO), or when AMO was in warm phase and PDO was in the cold phase. Only the 1983 change coincided with an unusually wet period that developed during the warm phases of ENSO and PDO. Quantitative reconstructions of salinity using the weighted averaging partial least squares (WA-PLS) diatom-based salinity model revealed a gradual increase in salinity at the three coring locations over the last ~ 600 yr, which was primarily caused by continuously rising sea level and in the last several decades also by the reduction of the amount of freshwater inflow from the mainland. 
Concentrations of sediment total nitrogen (TN), total phosphorus (TP) and total organic carbon (TOC) increased in the second half of the 20th century, which coincided with the construction of canals, landfills, marinas and water treatment plants along the western margin of Biscayne Bay. The increased magnitude and rate of diatom assemblage restructuring in the mid- and late-1900s suggest that large environmental changes are occurring more rapidly now than in the past.
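WA-PLS builds on plain weighted averaging (WA): each taxon gets a salinity optimum from the training set, and a fossil sample's salinity is the abundance-weighted mean of those optima; WA-PLS then adds further components that correct the residuals. A minimal sketch of the plain-WA step only, with an invented two-taxon training set (not the study's calibration data):

```python
def wa_optima(training_counts, env_values):
    """Taxon salinity optima by weighted averaging: each taxon's optimum
    is the abundance-weighted mean of the salinities where it occurs.
    training_counts[i][k] = abundance of taxon k in training sample i."""
    n_taxa = len(training_counts[0])
    optima = []
    for k in range(n_taxa):
        num = sum(row[k] * env
                  for row, env in zip(training_counts, env_values))
        den = sum(row[k] for row in training_counts)
        optima.append(num / den)
    return optima

def wa_reconstruct(fossil_counts, optima):
    """Reconstructed salinity of a fossil sample: abundance-weighted mean
    of the taxon optima (plain WA; WA-PLS adds residual-correcting
    components on top of this)."""
    num = sum(c * u for c, u in zip(fossil_counts, optima))
    return num / sum(fossil_counts)

# hypothetical training set: 3 samples x 2 diatom taxa, known salinities
train = [[10, 0], [5, 5], [0, 10]]
sal = [10.0, 20.0, 30.0]
u = wa_optima(train, sal)        # per-taxon salinity optima
est = wa_reconstruct([2, 8], u)  # salinity of a fossil assemblage
```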
Abstract:
Mesoscale eddies play a major role in controlling ocean biogeochemistry. By impacting nutrient availability and water column ventilation, they are of critical importance for oceanic primary production. In the eastern tropical South Pacific Ocean off Peru, where a large and persistent oxygen-deficient zone is present, mesoscale processes have been reported to occur frequently. However, investigations into their biological activity are mostly based on model simulations, and direct measurements of carbon and dinitrogen (N2) fixation are scarce. We examined an open-ocean cyclonic eddy and two anticyclonic mode water eddies: a coastal one and an open-ocean one in the waters off Peru along a section at 16°S in austral summer 2012. Molecular data and bioassay incubations point towards a difference between the active diazotrophic communities present in the cyclonic eddy and the anticyclonic mode water eddies. In the cyclonic eddy, highest rates of N2 fixation were measured in surface waters but no N2 fixation signal was detected at intermediate water depths. In contrast, both anticyclonic mode water eddies showed pronounced maxima in N2 fixation below the euphotic zone as evidenced by rate measurements and geochemical data. N2 fixation and carbon (C) fixation were higher in the young coastal mode water eddy compared to the older offshore mode water eddy. A co-occurrence between N2 fixation and biogenic N2, an indicator for N loss, indicated a link between N loss and N2 fixation in the mode water eddies, which was not observed for the cyclonic eddy. The comparison of two consecutive surveys of the coastal mode water eddy in November 2012 and December 2012 also revealed a reduction in N2 and C fixation at intermediate depths along with a reduction in chlorophyll by half, mirroring an aging effect in this eddy. Our data indicate an important role for anticyclonic mode water eddies in stimulating N2 fixation and thus supplying N offshore.
Abstract:
Megabenthos plays a major role in the overall energy flow on Arctic shelves, but information on megabenthic secondary production on large spatial scales is scarce. Here, we estimated for the first time megabenthic secondary production for the entire Barents Sea shelf by applying a species-based empirical model to an extensive dataset from the joint Norwegian–Russian ecosystem survey. Spatial patterns and relationships were analyzed within a GIS. The environmental drivers behind the observed production pattern were identified by applying an ordinary least squares regression model. Geographically weighted regression (GWR) was used to examine the varying relationship of secondary production and the environment on a shelfwide scale. Significantly higher megabenthic secondary production was found in the northeastern, seasonally ice-covered regions of the Barents Sea than in the permanently ice-free southwest. The environmental parameters that significantly relate to the observed pattern are bottom temperature and salinity, sea ice cover, new primary production, trawling pressure, and bottom current speed. The GWR proved to be a versatile tool for analyzing the regionally varying relationships of benthic secondary production and its environmental drivers (R² = 0.73). The observed pattern indicates tight pelagic–benthic coupling in the realm of the productive marginal ice zone. Ongoing decrease of winter sea ice extent and the associated poleward movement of the seasonal ice edge point towards a distinct decline of benthic secondary production in the northeastern Barents Sea in the future.
Abstract:
Eastern tropical Atlantic benthic foraminiferal Ba/Ca and Cd/Ca data from core V30-49 (3093 m) reveal large inferred changes in nutrient concentrations of deep Atlantic waters during the last 250 kyr. Relative changes in the North Atlantic Deep Water contribution to this site are estimated by scaling the V30-49 Cd/Ca record to values of modern end-member water masses; these estimates agree well with the relative structure and timing of circulation changes in the eastern tropical Atlantic reconstructed from a δ13C-record-based mixing model (Raymo et al., 1997, doi:10.1029/97PA01019). Temporal differences between the V30-49 Cd/Ca and Ba/Ca records suggest that the Ba/Ca record reflects changes in circulation together with an additional increase in the Ba composition of deep Atlantic water masses during glacial episodes, possibly resulting from increased productivity. Similarity between the δ13C and Ba/Ca records suggests that carbon isotopes in the deep glacial Atlantic also reflect productivity increases.
Abstract:
This study tests the hypothesis that implementing the same educational objective through either cooperative or collaborative learning in university teaching does not affect students' perceptions of the learning model. It analyses the reflections of two groups of engineering students that shared the same educational goals, implemented through two different active-learning strategies: simulation as a cooperative learning strategy and problem-based learning as a collaborative one. Despite the different number of participants per group (eighty-five and sixty-five, respectively) and the use of two different active-learning strategies, no differences in the results emerged from a qualitative perspective.