857 results for Critical Approach


Relevance: 30.00%

Abstract:

The first motivation for this note is to obtain a general version of the following result: let E be a Banach space and f : E → R be a differentiable function, bounded below and satisfying the Palais-Smale condition; then f is coercive, i.e., f(x) goes to infinity as ||x|| goes to infinity. In recent years, many variants and extensions of this result have appeared; see [3], [5], [6], [9], [14], [18], [19] and the references therein. A general result of this type was given in [3, Theorem 5.1] for a lower semicontinuous function defined on a Banach space, through an approach based on an abstract notion of subdifferential operator and taking into account the "smoothness" of the Banach space. Here, we give (Theorem 1) an extension in a metric setting, based on the notion of slope from [11], with coercivity considered in a generalized sense inspired by [9]; our result allows us to recover, for example, the coercivity result of [19], where a weakened version of the Palais-Smale condition is used. Our main tool (Proposition 1) is a consequence of Ekeland's variational principle extending [12, Corollary 3.4], and deals with a function f which is, in some sense, the "uniform" Γ-limit of a sequence of functions.
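
Stated formally, the classical result that motivates the note reads as follows; this is the standard textbook formulation restating the prose above, not an excerpt from the paper:

```latex
% Classical coercivity result (standard formulation):
\[
  f \colon E \to \mathbb{R} \ \text{differentiable}, \quad
  \inf_{E} f > -\infty, \quad f \ \text{satisfies (PS)}
  \;\Longrightarrow\;
  \lim_{\|x\| \to \infty} f(x) = +\infty .
\]
```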

Relevance: 30.00%

Abstract:

We prove some multiplicity results concerning quasilinear elliptic equations with natural growth conditions. Techniques of nonsmooth critical point theory are employed.

Relevance: 30.00%

Abstract:

Computer networks are a critical factor in the performance of a modern company. Managing networks is as important as managing any other aspect of the company's performance and security. There are many tools and appliances for monitoring traffic and analyzing network flow security. They use different approaches and rely on a variety of characteristics of the network flows. Network researchers are still working on a common approach to security baselining that might enable early-watch alerts. This research focuses on network security models, particularly the mitigation of Denial-of-Service (DoS) attacks, based on network flow analysis using flow measurements and the theory of Markov models. The paper comprises the essentials of the author's doctoral thesis.
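
The abstract does not spell out how the Markov models are built from flow measurements; a minimal sketch of the general idea, using hypothetical discretized flow-feature states rather than the author's actual model, might look like this:

```python
# Minimal sketch (not the author's model): score network-flow sequences
# with a first-order Markov chain. States are hypothetical discretized
# flow features (e.g. packets-per-second buckets).
import math
from collections import defaultdict

def fit_markov(states):
    """Estimate transition probabilities from a baseline state sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def log_likelihood(model, states, floor=1e-9):
    """Average log-likelihood of a sequence; low values flag anomalies."""
    ll = 0.0
    for a, b in zip(states, states[1:]):
        ll += math.log(model.get(a, {}).get(b, floor))
    return ll / max(len(states) - 1, 1)

# Usage: train on baseline traffic, alert when a window scores far below it.
baseline = ["low", "low", "mid", "low", "mid", "mid", "low", "low"]
model = fit_markov(baseline)
suspect = ["high", "high", "high", "high"]  # e.g. a sudden flood of flows
print(log_likelihood(model, baseline), log_likelihood(model, suspect))
```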

Relevance: 30.00%

Abstract:

The sharing of near real-time traceability knowledge in supply chains plays a central role in coordinating business operations and is a key driver of their success. However, before traceability datasets received from external partners can be integrated with datasets generated internally within an organisation, they need to be validated against information recorded for the physical goods received, as well as against bespoke rules defined to ensure uniformity, consistency and completeness within the supply chain. In this paper, we present a knowledge-driven framework for the runtime validation of critical constraints on incoming traceability datasets encapsulated as EPCIS event-based linked pedigrees. Our constraints are defined using SPARQL queries and SPIN rules. We present a novel validation architecture based on the integration of the Apache Storm framework for real-time, distributed computation with popular Semantic Web/Linked Data libraries, and exemplify our methodology on an abstraction of the pharmaceutical supply chain.
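
The paper's constraints are expressed as SPARQL queries and SPIN rules over EPCIS events; as a rough illustration of the validation step only, with a hypothetical namespace and a made-up completeness constraint rather than the paper's actual rules:

```python
# Illustrative only: validating an incoming event graph with a SPARQL ASK
# constraint via rdflib. The vocabulary URI and the constraint are
# hypothetical stand-ins, not the paper's actual SPIN rules.
from rdflib import Graph, Namespace, Literal, URIRef

EX = Namespace("http://example.org/epcis/")  # hypothetical namespace

g = Graph()
event = URIRef("http://example.org/events/1")
g.add((event, EX.epc, Literal("urn:epc:id:sgtin:0614141.107346.2017")))
# Deliberately omit EX.eventTime so the completeness check fails.

# Constraint: every event must carry both an EPC and an event time.
ASK_COMPLETE = """
ASK {
  ?e <http://example.org/epcis/epc> ?epc ;
     <http://example.org/epcis/eventTime> ?t .
}
"""
valid = g.query(ASK_COMPLETE).askAnswer
print("pedigree event complete:", valid)  # False -> reject the dataset
```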

Relevance: 30.00%

Abstract:

Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2015.

Relevance: 30.00%

Abstract:

We analyze a business model for e-supermarkets to enable multi-product sourcing capacity through co-opetition (collaborative competition). The logistics aspect of our approach is to design and execute a network system where “premium” goods are acquired from vendors at multiple locations in the supply network and delivered to customers. Our specific goals are to: (i) investigate the role of premium product offerings in creating critical mass and profit; (ii) develop a model for the multiple-pickup single-delivery vehicle routing problem in the presence of multiple vendors; and (iii) propose a hybrid solution approach. To solve the problem introduced in this paper, we develop a hybrid metaheuristic approach that uses a Genetic Algorithm for vendor selection and allocation, and a modified savings algorithm for the capacitated VRP with multiple pickup, single delivery and time windows (CVRPMPDTW). The proposed Genetic Algorithm guides the search for optimal vendor pickup location decisions, and for each generated solution in the genetic population, a corresponding CVRPMPDTW is solved using the savings algorithm. We validate our solution approach against published VRPTW solutions and also test our algorithm with Solomon instances modified for CVRPMPDTW.
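
The Clarke-Wright savings construction at the core of such a savings algorithm can be sketched as follows; this is the textbook capacitated-VRP version, without the multiple-pickup, single-delivery and time-window extensions the paper adds:

```python
# Textbook Clarke-Wright savings heuristic for the capacitated VRP.
# The paper modifies this for multiple pickups, single delivery and
# time windows; those extensions are omitted here.
import math

def savings_routes(coords, demand, capacity):
    """coords[0] is the depot; returns a list of routes (customer lists)."""
    d = lambda i, j: math.dist(coords[i], coords[j])
    n = len(coords)
    route_of = {i: i for i in range(1, n)}        # customer -> route id
    routes = {i: [i] for i in range(1, n)}        # route id -> customers
    load = {i: demand[i] for i in range(1, n)}
    savings = sorted(((d(0, i) + d(0, j) - d(i, j), i, j)
                      for i in range(1, n) for j in range(i + 1, n)),
                     reverse=True)
    for _, i, j in savings:                       # best savings first
        ri, rj = route_of[i], route_of[j]
        if ri == rj or load[ri] + load[rj] > capacity:
            continue
        # Merge only when i ends its route and j starts its route.
        if routes[ri][-1] == i and routes[rj][0] == j:
            routes[ri] += routes[rj]
            load[ri] += load[rj]
            for c in routes[rj]:
                route_of[c] = ri
            del routes[rj], load[rj]
    return list(routes.values())

coords = [(0, 0), (2, 0), (2, 1), (-1, 3)]   # depot + three customers
print(savings_routes(coords, demand=[0, 3, 4, 5], capacity=8))
```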

Relevance: 30.00%

Abstract:

A Vehicle-to-Grid (V2G) system with efficient Demand Response Management (DRM) is critical to solving the problem of supplying electricity by utilizing the surplus electricity available at EVs. An incentivized DRM approach is studied to reduce the system cost and maintain system stability. EVs are motivated by dynamic pricing determined through a group-selling based auction. In the proposed approach, a number of aggregators sit at the first auction level, each responsible for communicating with a group of EVs. EVs, as bidders, consider Quality of Energy (QoE) requirements and report their interests and decisions in the bidding process coordinated by the associated aggregator. Auction winners are determined based on the bidding prices and the amount of electricity sold by the EV bidders. We investigate the impact of the proposed mechanism on system performance under the maximum feedback power constraints of the aggregators. The designed mechanism is proven to have essential economic properties. Simulation results indicate that the proposed mechanism can reduce the system cost and offer EVs significant incentives to participate in the V2G DRM operation.
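
As a rough caricature of one piece of such a mechanism, not the paper's actual group-selling auction (which also handles QoE requirements and aggregator coordination), winner determination by price under a feedback-power cap might look like:

```python
# Rough illustration of winner determination in a reverse auction:
# EVs bid (price per kWh, kWh offered); the aggregator buys the cheapest
# energy first until its maximum feedback capacity is reached.
def select_winners(bids, max_kwh):
    """bids: list of (ev_id, price_per_kwh, kwh_offered)."""
    winners, bought = [], 0.0
    for ev, price, kwh in sorted(bids, key=lambda b: b[1]):
        if bought >= max_kwh:
            break
        take = min(kwh, max_kwh - bought)   # partial awards allowed
        winners.append((ev, take, price))
        bought += take
    return winners

bids = [("ev1", 0.12, 10), ("ev2", 0.10, 6), ("ev3", 0.15, 8)]
print(select_winners(bids, max_kwh=12))  # ev2 fully, ev1 partially
```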

Relevance: 30.00%

Abstract:

From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of the resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly affecting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reducing suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases. The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and revealed room for improvement in flood management by Mexican organisations.
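
For flavour only: the facility-location decision described above resembles a p-median problem. A naive greedy sketch, with made-up coordinates standing in for the GIS-filtered candidate sites, is shown below; the thesis models are far richer:

```python
# Minimal flavour of the facility-location component (not the thesis
# model): greedily open p shelters from candidate sites to minimise
# total distance to affected communities, after floodable sites have
# already been discarded by the GIS layer.
import math

def greedy_p_median(sites, communities, p):
    d = lambda a, b: math.dist(a, b)
    opened = []
    for _ in range(p):
        # Open the site that most reduces total community-to-shelter distance.
        best = min((s for s in sites if s not in opened),
                   key=lambda s: sum(min(d(c, f) for f in opened + [s])
                                     for c in communities))
        opened.append(best)
    return opened

sites = [(0, 0), (5, 5), (9, 1)]           # candidates surviving the GIS filter
communities = [(1, 1), (4, 6), (8, 2), (6, 4)]
print(greedy_p_median(sites, communities, p=2))
```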

Relevance: 30.00%

Abstract:

While performance measurement in the sport industry has a traditionally strong focus on sports results, the commercial success of sports clubs needs to gain more ground. Sports results should generate market revenues by satisfying customer needs, and allow continued investment in the further improvement of sports success (i.e. more funds for player transfers and wages). Club managers need to understand the complex relationship between on-field and off-the-field success, and identify critical success factors for achieving strategic objectives. The Balanced Scorecard approach provides a plausible framework for such analysis. Our paper explains the challenges of and opportunities for implementing a Balanced Scorecard system in non-profit organisations, and provides insights into its application in professional sport through an in-depth case study of a handball club in Hungary. We conclude by providing a model for managing sports organisations in line with strategic objectives, balancing stakeholder expectations for both sports results and commercial success.

Relevance: 30.00%

Abstract:

The tragic events of September 11th ushered in a new era of unprecedented challenges. Our nation has to be protected from the alarming threats of adversaries. These threats exploit the nation's critical infrastructures, affecting all sectors of the economy. There is a need for pervasive monitoring and decentralized control of the nation's critical infrastructures. The communication needs of monitoring and control of critical infrastructures were traditionally catered for by wired communication systems. These technologies ensure high reliability and bandwidth but are very expensive and inflexible, and do not support mobility or pervasive monitoring. The communication protocols are Ethernet-based and use contention access protocols, which result in frequent unsuccessful transmissions and high delay. An emerging class of wireless networks, embedded wireless sensor and actuator networks, has potential benefits for real-time monitoring and control of critical infrastructures. The use of embedded wireless networks for monitoring and control of critical infrastructures requires secure, reliable and timely exchange of information among controllers, distributed sensors and actuators. The exchange of information is over shared wireless media. However, wireless media are highly unpredictable due to path loss, shadow fading and ambient noise. Monitoring and control applications have stringent requirements on reliability, delay and security. The primary issue addressed in this dissertation is the impact of wireless media in harsh industrial environments on the reliable and timely delivery of critical data. In the first part of the dissertation, a combined networking and information-theoretic approach was adopted to determine the transmit power required to maintain a minimum wireless channel capacity for reliable data transmission. The second part describes a channel-aware scheduling scheme that ensures efficient utilization of the wireless link and guarantees bounded delay. Analytical evaluations and simulations are used to validate the feasibility of the methodologies and demonstrate that the protocols achieve reliable and real-time data delivery in wireless industrial networks.
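
The abstract does not reproduce the derivation, but the Shannon capacity relation that such an information-theoretic analysis builds on gives the shape of the transmit-power computation; the channel model and numbers below are illustrative assumptions only, whereas the dissertation's model also covers shadow fading and industrial noise:

```python
# Shannon capacity relation underlying the transmit-power analysis:
#   C = B * log2(1 + P_rx / N),  with  P_rx = P_tx / L  (path loss L)
# => minimum transmit power for a target capacity C_min:
#   P_tx = L * N * (2 ** (C_min / B) - 1)
def min_transmit_power(c_min_bps, bandwidth_hz, noise_w, path_loss):
    """Smallest P_tx (watts) keeping channel capacity at c_min_bps."""
    return path_loss * noise_w * (2 ** (c_min_bps / bandwidth_hz) - 1)

# Illustrative values: 1 Mbps target over a 2 MHz channel,
# 1 pW noise floor, 90 dB (1e9) path loss.
p = min_transmit_power(c_min_bps=1e6, bandwidth_hz=2e6,
                       noise_w=1e-12, path_loss=1e9)
print(f"{p * 1e3:.3f} mW")  # ~0.414 mW under these assumed values
```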

Relevance: 30.00%

Abstract:

The span of control is the most discussed single concept in classical and modern management theory. In specifying conditions for organizational effectiveness, the span of control has generally been regarded as a critical factor. Existing research has focused mainly on qualitative methods for analyzing this concept, for example heuristic rules based on experience and/or intuition. This research takes a quantitative approach to the problem and formulates it as a binary integer model, which is used as a tool to study the organizational design issue. The model considers a range of requirements affecting the management and supervision of a given set of jobs in a company. The decision variables include the allocation of jobs to workers, considering the complexity and compatibility of each job with respect to workers, and the management requirements for planning, execution, training, and control activities in a hierarchical organization. The objective of the model is to minimize operations cost, defined as the sum of supervision costs at each level of the hierarchy and the costs of workers assigned to jobs. The model is intended for application in make-to-order industries as a design tool. It could also be applied to make-to-stock companies as an evaluation tool, to assess the optimality of their current organizational structure. Extensive experiments were conducted to validate the model, to study its behavior, and to evaluate the impact of changing parameters on practical problems. This research proposes a meta-heuristic approach to solving large-size problems, based on the concept of greedy algorithms and the Meta-RaPS algorithm. The proposed heuristic was evaluated with two measures of performance: solution quality and computational speed. Quality is assessed by comparing the obtained objective function value to the one achieved by the optimal solution. Computational efficiency is assessed by comparing the computer time used by the proposed heuristic to the time taken by a commercial software system. Test results show the proposed heuristic procedure generates good solutions in a time-efficient manner.
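
Meta-RaPS works by randomising a greedy priority rule. A toy sketch of that idea for the job-to-worker assignment, with hypothetical costs and parameter values and without the hierarchical supervision costs of the full model, could look like:

```python
# Toy sketch in the spirit of Meta-RaPS (randomised priority search):
# assign each job to a compatible worker, usually the cheapest one, but
# occasionally pick randomly among near-cheapest options to escape
# greedy local optima. Costs, compatibility sets and parameters here
# are hypothetical illustrations only.
import random

def raps_assign(cost, compatible, priority=0.7, restriction=0.2, seed=1):
    """cost[j][w]: cost of worker w doing job j; compatible[j]: worker set."""
    rng = random.Random(seed)
    assignment = {}
    for j in range(len(cost)):
        options = sorted(compatible[j], key=lambda w: cost[j][w])
        if rng.random() < priority:
            assignment[j] = options[0]            # pure greedy move
        else:                                     # restricted random move
            cutoff = cost[j][options[0]] * (1 + restriction)
            assignment[j] = rng.choice([w for w in options
                                        if cost[j][w] <= cutoff])
    return assignment

cost = [[4, 7, 3], [5, 2, 6], [8, 4, 4]]
compatible = [{0, 2}, {1, 2}, {0, 1, 2}]
print(raps_assign(cost, compatible))
```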

Relevance: 30.00%

Abstract:

This dissertation develops an image processing framework with unique feature extraction and similarity measurements for human face recognition in the thermal mid-wave infrared portion of the electromagnetic spectrum. The goal of this research is to design specialized algorithms that extract facial vasculature information, create a thermal facial signature and identify the individual. The objective is to use such findings in support of a biometrics system for human identification with a high degree of accuracy and reliability, the latter owing to the minimal to no risk of alteration of the intrinsic physiological characteristics seen through thermal infrared imaging. The proposed thermal facial signature recognition is fully integrated and consolidates the main and critical steps of feature extraction, registration, matching through similarity measures, and validation through testing our algorithm on a database, referred to as C-X1, provided by the Computer Vision Research Laboratory at the University of Notre Dame. Feature extraction was accomplished by first registering the infrared images to a reference image using the functional MRI of the Brain's (FMRIB's) Linear Image Registration Tool (FLIRT), modified to suit thermal infrared images. This was followed by segmentation of the facial region using an advanced localized contouring algorithm applied to anisotropically diffused thermal images. Thermal feature extraction from facial images was attained by performing morphological operations such as opening and top-hat segmentation to yield thermal signatures for each subject. Four thermal images taken over a period of six months were used to generate thermal signatures and a thermal template for each subject; the thermal template contains only the most prevalent and consistent features. Finally, a similarity measure technique was used to match signatures to templates, and Principal Component Analysis (PCA) was used to validate the results of the matching process. Thirteen subjects were used to test the developed technique on an in-house thermal imaging system. Matching with a Euclidean-based similarity measure showed 88% accuracy for skeletonized signatures and templates, and 90% accuracy for anisotropically diffused signatures and templates. We also employed the Manhattan-based similarity measure and obtained an accuracy of 90.39% for skeletonized and diffused templates and signatures. An average 18.9% improvement in the similarity measure was obtained when using diffused templates. The Euclidean- and Manhattan-based similarity measures were also applied to skeletonized signatures and templates of 25 subjects in the C-X1 database. The highly accurate results obtained in the matching process, along with the generalized design process, clearly demonstrate the ability of the thermal infrared system to be used with other thermal imaging based systems and related databases. A novel user-initialized registration of thermal facial images has been successfully implemented. Furthermore, the novel approach of developing a thermal signature template from four images taken at various times ensured that unforeseen changes in the vasculature did not affect the biometric matching process, as it relied on consistent thermal features.
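
The Euclidean- and Manhattan-based measures used in the matching step are standard; applied to hypothetical fixed-length feature vectors (the actual signatures and templates are richer structures), matching reduces to a nearest-template search:

```python
# The two distance measures used in the matching step, applied to
# hypothetical fixed-length feature vectors extracted from thermal
# signatures; the dissertation matches richer structures.
import numpy as np

def euclidean(a, b):
    return float(np.linalg.norm(a - b))     # L2 distance

def manhattan(a, b):
    return float(np.abs(a - b).sum())       # L1 distance

signature = np.array([0.9, 0.1, 0.4, 0.7])
templates = {"subj_A": np.array([0.8, 0.2, 0.5, 0.6]),
             "subj_B": np.array([0.1, 0.9, 0.8, 0.2])}
# Identify the subject whose template is nearest to the probe signature.
best = min(templates, key=lambda s: euclidean(signature, templates[s]))
print(best)  # subj_A
```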

Relevance: 30.00%

Abstract:

How might education professors disrupt traditional curriculum and teaching practices that teach future teachers to label, segregate, and marginalize students with disabilities? The Disability Studies in Education (DSE) approach grounds practice on the perspectives of people with disabilities and challenges practices that isolate and de-humanize individuals. The pedagogy for eliciting critical book reviews using a DSE perspective is described.

Relevance: 30.00%

Abstract:

For the last three decades, the Capital Asset Pricing Model (CAPM) has been the dominant model for calculating expected return. In the early 1990s, Fama and French (1992) developed the Fama-French three-factor model by adding two additional factors to the CAPM. However, even with these models, estimates of the expected return have been found to be inaccurate (Elton, 1999; Fama & French, 1997). Botosan (1997) introduced a new approach to estimating the expected return. This approach employs an equity valuation model to calculate the internal rate of return (IRR), often called the "implied cost of equity capital", as a proxy for the expected return. The approach has been gaining popularity among researchers. A critical review of the literature will help inform hospitality researchers about the issue and encourage them to implement the new approach in their own studies.
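
For reference, the CAPM relation under discussion is the standard textbook one, not a formulation specific to any of the cited papers:

```latex
% Standard CAPM expected-return relation:
\[
  \mathbb{E}[R_i] \;=\; R_f + \beta_i \bigl( \mathbb{E}[R_m] - R_f \bigr),
\]
% where $R_f$ is the risk-free rate, $R_m$ the market return, and
% $\beta_i$ the sensitivity of asset $i$ to market risk. Botosan's
% alternative instead solves an equity valuation model for the internal
% rate of return that equates price with discounted expected payoffs.
```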

Relevance: 30.00%

Abstract:

As researchers and practitioners move towards a vision of software systems that configure, optimize, protect, and heal themselves, they must also consider the implications of such self-management activities for software reliability. Autonomic computing (AC) describes a new generation of software systems that are characterized by dynamically adaptive self-management features. During dynamic adaptation, autonomic systems modify their own structure and/or behavior in response to environmental changes. Adaptation can result in new system configurations and capabilities, which need to be validated at runtime to prevent costly system failures. However, although the pioneers of AC recognize that validating autonomic systems is critical to the success of the paradigm, the architectural blueprint for AC does not provide a workflow or supporting design models for runtime testing. This dissertation presents a novel approach for seamlessly integrating runtime testing into autonomic software. The approach introduces an implicit self-test feature into autonomic software by tailoring the existing self-management infrastructure to runtime testing. Autonomic self-testing facilitates activities such as test execution, code coverage analysis, timed test performance, and post-test evaluation. In addition, the approach is supported by automated testing tools and a detailed design methodology. A case study that incorporates self-testing into three autonomic applications is also presented. The findings of the study reveal that autonomic self-testing provides a flexible approach for building safe, reliable autonomic software, while limiting the development and performance overhead through software reuse.
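
The dissertation's design models are not given in the abstract; the core idea of piggybacking runtime testing on the self-management loop can be caricatured as follows, with hypothetical class and test shapes rather than the actual framework:

```python
# Caricature of the implicit self-test idea (not the dissertation's
# actual design): after each dynamic adaptation, the manager runs a
# validation suite against the new configuration before committing,
# rolling back if any runtime test fails.
class AutonomicManager:
    def __init__(self, system, tests):
        self.system = system
        self.tests = tests              # callables: system -> bool

    def adapt(self, new_config):
        old = self.system.config
        self.system.config = new_config           # dynamic adaptation
        if all(test(self.system) for test in self.tests):
            return True                            # validated at runtime
        self.system.config = old                   # roll back unsafe change
        return False

class System:
    def __init__(self):
        self.config = {"threads": 4}

managed = System()
mgr = AutonomicManager(managed, [lambda s: s.config["threads"] > 0])
print(mgr.adapt({"threads": 8}), mgr.adapt({"threads": 0}))  # True False
```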