Articles
Permanent URI for this collection: https://repositorio.fei.edu.br/handle/FEI/792
Search Results (73 items)
- Architectural design group decision-making in agile projects (2017-06-23) LOPES, S. V. F.; Plinio Thomaz Aquino Junior. © 2017 IEEE. Software architecture has many definitions. One widely accepted definition is that a software architecture is a composition of a set of architectural design decisions; hence, designing a software architecture is a decision-making process. Agile methods drastically changed the way software architectures are designed. In projects using agile methods (e.g. Scrum), making architectural design decisions is not the responsibility of a single person, but rather of the whole development team. Despite the popularity of such methods in industry, little research exists on how to make these decisions effectively from the perspective of a group. Current techniques usually focus on the identification of quality attributes and design alternatives, not addressing the whole decision-making process. The quality of the decisions is directly reflected in the quality of the software architecture; therefore, poor decisions lead to bad software architectures. In this paper, we discuss current research on group decision-making in software architecture and propose a combination of concepts from two architecture definition methods into a single approach that can be used in agile projects and addresses the most critical concerns of group decision-making. This proposal is part of a master's research project.
- Comparison of Bio-Inspired Algorithms from the Point of View of Medical Image Segmentation (2018) WACHS LOPES, G. A.; BELTRAME, F. S.; SANTOS, R. M.; RODRIGUES, P. S. © 2018 IEEE. As new technological challenges that depend on the computational performance of bio-inspired algorithms emerge, the demand for more efficient heuristic solutions grows at the same rate. The medical field is one of the most challenging, because pre-processing steps, such as multilevel segmentation of color spaces, require greater precision. Thus, many algorithms inspired by natural behavior have emerged, aiming to find approximate solutions compatible with optimal ones but with much higher performance in terms of computational time. Although they perform well, some of these newer algorithms have not yet been analyzed in terms of their practical applicability to one or more medical databases. This paper presents a comparative study, from a practical point of view, of three of these new algorithms: Cuckoo Search (CS), Krill Herd (KH) and Elephant Herd Optimization (EHO). Our results suggest that the three algorithms are comparable in terms of performance on medical databases, with EHO showing the best performance among the three (a minimal sketch of the multilevel-thresholding objective such metaheuristics optimize appears after this listing).
- Synthesizing 3D face shapes using tensor-based multivariate statistical discriminant methods (2011-11-14) MONOI, J.-L.; Plinio Thomaz Aquino Junior; GILLIES, D. F. We have implemented methods to reconstruct and model 3D face shapes and to synthesize facial expressions from a set of real human 3D face surface maps. The method employed tensor-based statistical shape modelling and statistical discriminant modelling methods. In the statistical shape modelling approach, new face shapes are created by moving the surface points along the appropriate expressive direction in the training-set space. In the statistical discriminant model, new face shapes, such as facial expressions, can be synthesized by moving the surface points along the most discriminant direction found from the classes of expressions in the training set (see the sketch after this listing). The advantage of the tensor-based statistical discriminant analysis method is that face shapes of varying degrees of expression can be generated from a small number of examples available in 3D face shape datasets. The results of the reconstruction and synthesis of three-dimensional faces are illustrated in the paper. © 2011 Springer-Verlag.
- CARES 3.0: A two stage system combining feature-based recognition and edge-based segmentation for CIMT measurement on a multi-institutional ultrasound database of 300 images (2011-08-30) MOLINARI, F.; MELBURGER, K. M.; ACHARYA, U. R.; ZENG, G.; Paulo Rodrigues; SABA, L.; NICOLAIDES, A.; SURI, J. S. The intima-media thickness of the carotid artery (CIMT) is a validated marker of atherosclerosis. Accurate CIMT measurement can be performed by specifically designed computer algorithms. We improved a previous CIMT measurement technique by introducing a smart heuristic search for the lumen-intima (LI) and media-adventitia (MA) interfaces of the carotid distal wall. We called this new release CARES 3.0 (a class of AtheroEdge™ system, a patented technology from Global Biomedical Technologies, Inc., CA, USA). CARES 3.0 is completely automated and adopts an integrated approach: carotid location in the image frame, followed by segmentation based on an edge snapper and heuristic search. CARES 3.0 was benchmarked against three other techniques, one of them user-driven, on a 300-image multi-institutional database. The CARES 3.0 CIMT measurement bias was -0.021 ± 0.182 mm, better than that of the semi-automated method (-0.036 ± 0.183 mm), and CARES 3.0 outperformed the other two fully automated methods. The figure-of-merit of CARES 3.0 was 97.4%, better than that of the semi-automated technique (95.4%) (a sketch of how bias and figure-of-merit can be computed appears after this listing). © 2011 IEEE.
- On source code completion assistants and the need of a context-aware approach (2017-07-09) ARREBOLA, F. V.; Plinio Thomaz Aquino Junior. © Springer International Publishing AG 2017. Source code completion assistance is a popular feature in modern IDEs. However, despite their popularity, there is little research about the key characteristics and limitations of code completion assistants, and little research about the way software developers interact with them, especially considering the different techniques assistants use to populate the list of possible completions. This paper presents a study of the features of currently available code assistants and an experiment with professional Java developers familiar with the Eclipse platform, aimed at collecting and interpreting usage data of two popular code completion assistants during the execution of three programming tasks. Results indicate that in half of the interactions with code assistants the proposals are dismissed, the interaction is interrupted, or the completion proposals displayed make no direct contribution to the completion of the programming task. In that sense, we argue that code assistants still have a long road ahead, since they seem to overlook the ultimate goals of the task at hand and lack the ability to identify and explore the concepts of context-aware computing. The results of this paper can drive future HCI research toward the design of adaptive code completion assistants that respond to end-user behaviors and preferences.
- A reference process to design information systems for sustainable design based on LCA, PSS, social and economic aspects (2010) SANTAMA, F. S.; BARBERATO, C.; SARAIVA, A. M. © IFIP International Federation for Information Processing 2010. The purpose of Sustainable Design (SD) is to satisfy customer needs while reducing environmental impacts. The main challenge is to integrate Life Cycle Assessment, Product Service Systems, and social and economic aspects while considering the tensions and trade-offs of each activity in depth. SD requires data from many sources, in addition to many software tools, to perform each analysis. In order to provide information systems for SD, the adoption of a Service-Oriented Architecture (SOA) is appropriate because of the integration requirements. SOA best practices recommend the design of a reference process prior to architectural definitions, so as to identify the complexities and provide a comprehensive solution to the problem. A reference process is presented here as the first step for building information systems for SD. In addition, the reference process presents a list of activities to be performed during the design stage and is very helpful as a guide for SD beginners.
- Fuzzy decision tree applied to defects classification of glass manufacturing using data from a glass furnace model (2012) COSTA, H. R. N.; LA NEVE, A. A Fuzzy Decision Tree (FID 3.4) and other algorithms were used to classify the defects that occur in the production process of glass for packaging. In this study we used data from the "Newglass" project, installed in Portugal, which employed a factory model to study the process of manufacturing glass packaging. The "Newglass" project database consists of the operating variables of the furnace and the percentage of defects found in the end products of the factory model. The classification obtained with the Fuzzy Decision Tree was compared with the results obtained in the manufacture of glass for packaging. The classifications obtained in manufacturing and in the FID 3.4 software were also compared with those obtained with CART (Classification and Regression Tree) and Artificial Neural Network (ANN) algorithms (a minimal sketch of the CART baseline appears after this listing). © 2012 IEEE.
- A new approach based on computer vision and non-linear Kalman filtering to monitor the nebulization quality of oil flames (2013-09-15) FLEURY, A. T.; TRIGO, F. C.; MARTINS, F. P. R. The nebulization quality of oil flames, an important characteristic exhibited by combustion processes of petroleum refinery furnaces, is mostly affected by variations in the vapor flow rate (VFR). Expressive visual changes in the flame patterns and decay of the combustion efficiency are observed when the process is tuned by diminishing the VFR. Such behavior is supported by experimental evidence showing that too low values of VFR and an increase in the solid particulate material rate are strongly correlated. Given the economic importance of keeping this parameter under control, a laboratory vertical furnace was devised with the purpose of carrying out experiments to prototype a computer vision system capable of estimating VFR values through the examination of test characteristic vectors based on geometric properties of the grey-level histogram of instantaneous flame images. First, a training set composed of feature vectors from all the images collected during experiments with a priori known VFR values is properly organized, and an algorithm is applied to this data in order to generate a fuzzy measurement vector whose components represent membership degrees to the 'high nebulization quality' fuzzy set. Fuzzy classification vectors from images with unknown a priori VFR values are then assumed to be state vectors in a random-walk model, and a non-linear Tikhonov-regularized Kalman filter is applied to estimate the state and the corresponding nebulization quality (a simplified sketch of such a random-walk filter appears after this listing). The successful validation of the output data, even based on small training data sets, indicates that the proposed approach could be applied to synthesize a real-time algorithm for evaluating the nebulization quality of combustion processes in petroleum refinery furnaces that use oil flames as the heating source. © 2013 Elsevier Ltd. All rights reserved.
- Comparative study of self-heating effects influence on triple-gate FinFETs fabricated on bulk, SOI and modified substrates (2013-09-02) D'ANGELO, R.; AGOPIAN, P. G. D. This work presents a comparative study of the influence of self-heating effects (SHE) on FinFET performance for four different substrates: bulk, SOI, SDSOI and MSDSOI. The analysis is based on three-dimensional numerical simulations and focuses mainly on the analog parameters. Although SOI FinFET devices usually present better performance than the others, when self-heating is taken into consideration they show a degradation of the drain current (IDS) level, resulting in a negative slope of IDS and consequently a negative output conductance, which precludes the intrinsic voltage gain analysis. It is demonstrated that the MSDSOI structure is the most optimized structure for analog applications when the access window dimension is varied according to the gate and drain bias. © 2013 IEEE.
- Automatic interface optimization through random exploration of available elements (2013-08-05) FERREIRA, L. A.; MASIERO, A. A.; Plinio Thomaz Aquino Junior; DA COSTA BIACHI, R. A. The Keystroke-Level Model (KLM) is an interface evaluation method that uses as its metric the time needed to perform the actions required to complete a given task. The description used in KLM is very similar to the formalism that a Markov Decision Process (MDP) uses to describe a domain in which an artificial agent must perform a sequence of actions in order to solve a problem. This work presents a way to model a user's interaction with an interface using an MDP combined with KLM, in order to optimize a set of parameters and find the best set of interface components for a user (a minimal sketch of this combination appears below). Results show that by changing the metrics of the KLM, the MDP finds different solutions that may be combined to generate an interface tailored to a given user. © 2013 ACM.
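
For the bio-inspired segmentation comparison (Wachs Lopes et al., above), here is a minimal sketch of the kind of objective such metaheuristics optimize: Otsu-style between-class variance for multilevel thresholding of a grey-level histogram. Plain random search stands in for CS, KH and EHO, and the image is synthetic; this is an illustration under those assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): multilevel thresholding posed as an
# optimization problem, with plain random search standing in for CS/KH/EHO.
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(128, 128))            # synthetic grayscale image
hist = np.bincount(image.ravel(), minlength=256) / image.size
levels = np.arange(256)

def between_class_variance(thresholds):
    """Otsu-style objective over the histogram segments: higher is better."""
    cuts = [0] + sorted(int(t) for t in thresholds) + [256]
    total_mean = (hist * levels).sum()
    variance = 0.0
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        w = hist[lo:hi].sum()
        if w == 0:
            continue
        mu = (hist[lo:hi] * levels[lo:hi]).sum() / w
        variance += w * (mu - total_mean) ** 2
    return variance

def random_search(num_thresholds=3, iterations=2000):
    """Stand-in for a metaheuristic: sample threshold vectors, keep the best."""
    best, best_score = None, -np.inf
    for _ in range(iterations):
        candidate = rng.choice(np.arange(1, 255), size=num_thresholds, replace=False)
        score = between_class_variance(candidate)
        if score > best_score:
            best, best_score = sorted(int(t) for t in candidate), score
    return best, best_score

print(random_search())
```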
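
For the 3D face synthesis entry (tensor-based discriminant methods), a toy sketch of the core idea of moving a mean shape along a discriminant direction between two expression classes. The data layout (flattened vertex coordinates) and the plain two-class Fisher direction are assumptions made here for illustration; the paper's tensor-based formulation is richer.

```python
# Toy sketch (assumed data layout, not the authors' tensor pipeline): synthesize a new
# face shape by moving the mean shape along the most discriminant direction between
# two expression classes; each sample is a flattened (n_vertices * 3) coordinate vector.
import numpy as np

rng = np.random.default_rng(1)
n_vertices = 300
neutral = rng.normal(0.0, 1.0, size=(20, n_vertices * 3))            # toy training shapes
smiling = neutral + rng.normal(0.3, 0.1, size=(20, n_vertices * 3))  # toy "expression" offset

def fisher_direction(class_a, class_b):
    """Two-class Fisher discriminant direction w ~ Sw^-1 (mu_b - mu_a)."""
    mu_a, mu_b = class_a.mean(axis=0), class_b.mean(axis=0)
    sw = np.cov(class_a, rowvar=False) + np.cov(class_b, rowvar=False)
    w = np.linalg.solve(sw + 1e-6 * np.eye(sw.shape[0]), mu_b - mu_a)  # small ridge for stability
    return w / np.linalg.norm(w)

w = fisher_direction(neutral, smiling)
mean_shape = neutral.mean(axis=0)
# Varying alpha yields expressions of varying intensity from a small training set.
for alpha in (0.5, 1.0, 2.0):
    new_shape = (mean_shape + alpha * w).reshape(n_vertices, 3)
    print(alpha, new_shape.shape)
```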
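
For the CARES 3.0 entry, a small sketch of how the reported statistics can be computed from automated and ground-truth CIMT values: the per-image bias (mean ± standard deviation of the error) and a figure-of-merit. The figure-of-merit formula below is one common convention in the CIMT literature and is an assumption here, as are the toy values.

```python
# Minimal sketch (assumed definitions, toy values): CIMT measurement bias and a
# figure-of-merit from automated vs. ground-truth thickness values in millimetres.
import numpy as np

def cimt_bias(auto_mm, gt_mm):
    """Mean and standard deviation of the per-image error (automated - ground truth)."""
    errors = np.asarray(auto_mm) - np.asarray(gt_mm)
    return errors.mean(), errors.std()

def figure_of_merit(auto_mm, gt_mm):
    """FoM as 100 * (1 - |mean(auto) - mean(gt)| / mean(gt)); one common convention."""
    auto_mm, gt_mm = np.asarray(auto_mm), np.asarray(gt_mm)
    return 100.0 * (1.0 - abs(auto_mm.mean() - gt_mm.mean()) / gt_mm.mean())

# Toy values only; the paper's database holds 300 multi-institutional images.
gt = [0.71, 0.80, 0.65, 0.92]
auto = [0.69, 0.83, 0.66, 0.90]
print(cimt_bias(auto, gt), figure_of_merit(auto, gt))
```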
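
For the glass-defect classification entry, a toy sketch of the CART baseline mentioned in the abstract, assuming scikit-learn is available. The operating variables and defect labels are synthetic stand-ins for the "Newglass" furnace data; FID 3.4 itself is a separate fuzzy decision tree tool and is not reproduced here.

```python
# Toy sketch (synthetic data, assumed layout): a CART-style decision tree classifying
# defect levels from furnace operating variables, as a baseline comparison.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
# Hypothetical operating variables, e.g. temperature, pull rate, fuel/air ratio.
X = rng.normal(size=(400, 3))
# Hypothetical defect class derived from the variables plus noise.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, 400) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
cart = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
print("CART test accuracy:", cart.score(X_test, y_test))
```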
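
For the oil-flame nebulization entry, a simplified sketch of a random-walk Kalman filter smoothing a noisy fuzzy membership degree over successive flame images. The paper uses a non-linear, Tikhonov-regularized filter; the plain one-dimensional filter below only illustrates the random-walk state model and is an assumption for illustration.

```python
# Simplified sketch (assumption, not the authors' regularized non-linear filter): a
# 1-D Kalman filter with a random-walk state model, smoothing the noisy fuzzy
# membership degree ("high nebulization quality") measured from successive flame images.
import numpy as np

def random_walk_kalman(measurements, process_var=1e-3, meas_var=1e-2):
    x, p = measurements[0], 1.0          # initial state estimate and variance
    estimates = []
    for z in measurements:
        p = p + process_var              # predict: random walk, state prediction unchanged
        k = p / (p + meas_var)           # Kalman gain
        x = x + k * (z - x)              # update with the new fuzzy measurement
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(2)
true_quality = np.clip(np.linspace(0.9, 0.4, 50), 0, 1)       # quality decays as VFR drops
noisy = np.clip(true_quality + rng.normal(0, 0.08, 50), 0, 1)  # noisy fuzzy measurements
print(random_walk_kalman(noisy)[-5:])
```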
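
For the KLM/MDP interface optimization entry, a toy sketch of the combination: a small deterministic MDP over interface states whose action costs are Keystroke-Level Model operator times, solved by value iteration. The interface, states and transitions are hypothetical; the operator times are the commonly cited KLM estimates (K ≈ 0.2 s, P ≈ 1.1 s, H ≈ 0.4 s, M ≈ 1.35 s).

```python
# Toy sketch (hypothetical interface, standard KLM operator estimates assumed): a tiny
# deterministic MDP whose action costs are KLM execution times, solved by value
# iteration to find the cheapest sequence of interface actions for a task.
# KLM operators: K = keystroke, P = point with mouse, H = home hands, M = mental act.
KLM = {"K": 0.2, "P": 1.1, "H": 0.4, "M": 1.35}

# state -> list of (action label, KLM cost in seconds, next state); "done" is terminal.
TRANSITIONS = {
    "start": [("point_menu", KLM["H"] + KLM["P"], "menu"),
              ("shortcut",   KLM["M"] + KLM["K"], "done")],
    "menu":  [("click_item", KLM["P"] + KLM["K"], "done")],
    "done":  [],
}

def value_iteration(transitions, sweeps=50):
    """Minimize total execution time (costs are positive, so we take the minimum)."""
    value = {s: 0.0 for s in transitions}
    for _ in range(sweeps):
        for state, options in transitions.items():
            if options:
                value[state] = min(cost + value[nxt] for _, cost, nxt in options)
    return value

values = value_iteration(TRANSITIONS)
best = min(TRANSITIONS["start"], key=lambda o: o[1] + values[o[2]])
print(values, "best first action:", best[0])
```

Changing the KLM costs (e.g. a slower pointing time for a given user) changes the optimal policy, which is the mechanism the abstract describes for tailoring an interface to a user.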