IPL auction: Complete players' list and their base price
The IPL 2018 auction will take place in Bengaluru on January 27 and 28 where a total of 578 players will go under the hammer.
A fierce bidding war is expected, with Indian stars such as Gautam Gambhir and Ajinkya Rahane, off-spinners Ravichandran Ashwin and Harbhajan Singh, mystery spinner Kuldeep Yadav, and openers KL Rahul and Murali Vijay on the franchises' radar.
The list also features overseas players, including Glenn Maxwell, Chris Gayle, Shane Watson, Rashid Khan and Eoin Morgan.
In the 10th season, Mumbai Indians, led by Rohit Sharma, defeated Rising Pune Supergiant to lift the IPL trophy for the third time.
The 11th season of the much-awaited Indian Premier League will begin on April 6, 2018.
The opening ceremony of the tournament will take place on April 6 in Mumbai while the first match of the new season will be played at the same venue on April 7.
The tournament will continue until May 27, with Mumbai once again hosting the final match.
Here is the complete list of players and their base price.
Base Price ₹2,00,00,000
Batsman
KL Rahul, Murali Vijay, Brendon McCullum, Eoin Morgan, Cameron White, Chris Lynn, Colin Ingram
Bowler
Josh Hazlewood, Rashid Khan Arman, Karn Sharma, Yuzvendra Singh Chahal, Mitchell Johnson, Liam Plunkett, Pat Cummins
Wicket Keeper
Quinton De Kock, Dinesh Karthik, Robin Uthappa
All-Rounder
James Faulkner, Marcus Stoinis, Chris Woakes, Angelo Mathews, David Willey, Corey Anderson, Kedar Jadhav
==========
Base Price ₹1,50,00,000
Batsman
Aaron Finch, Jason Roy, Hashim Amla, Evin Lewis, Travis Head, Shaun Marsh, Michael Klinger, Lendl Simmons, David Miller
Bowler
Kagiso Rabada, Trent Boult, Kyle Abbott, Kuldeep Singh Yadav, Nathan Coulter-Nile, Amit Mishra, Mohit Sharma, Nathan Lyon, Steven Finn, Harry Gurney, Mark Wood, Jaydev Unadkat
Wicket Keeper
Jonny Bairstow, Jos Buttler, Peter Handscomb
All-Rounder
Moises Henriques, Ravi Bopara, Jason Holder, Moeen Ali, M.S. Washington Sundar
==========
Base Price ₹1,00,00,000
Batsman
Manish Pandey, Dwayne Smith, Alex Hales
Bowler
Tymal Mills, Andrew Tye, Mohammed Siraj, Adam Zampa, Mohammad Shami, Dale Steyn, Mustafizur Rahman, Samuel Badree, Imran Tahir, Tim Southee, Jason Behrendorff, Mitchell McClenaghan, Lasith Malinga, Ranganath Vinay Kumar, Umesh Yadav, Piyush Chawla
Wicket Keeper
Parthiv Patel, Wriddhiman Saha, Sanju Samson, Sam Billings
All-Rounder
Daniel Christian, Carlos Brathwaite, Ben Cutting, Jean-Paul Duminy, Shane Watson, Chris Jordan, Tom Curran
==========
Base Price ₹75,00,000
Batsman
Martin Guptill, Darren Bravo, Cheteshwar Pujara, Ross Taylor, Usman Khawaja
Bowler
Peter Siddle, Jerome Taylor, Lockie Ferguson, Morne Morkel, Ishant Sharma, Shardul Narendra Thakur, Adam Milne, Marchant De Lange
Wicket Keeper
Naman Ojha, Johnson Charles, Luke Ronchi
All-Rounder
Darren Sammy, Colin De Grandhomme, Yusuf Pathan, Adil Rashid, Joe Denly, Samit Patel, Wayne Parnell
==========
Base Price ₹50,00,000
Batsman
Reeza Hendricks, Mandeep Hardev Singh, Anton Devcich, Upul Tharanga, Karun Nair, Billy Stanlake, Joe Burns, Manoj Tiwary, Saurabh Tiwary, Tamim Khan, Aiden Markram, Faiz Fazal, Abhinav Mukund, Venugopal Rao, Dean Elgar, Najibullah Zadran
Bowler
Ben Laughlin, Ronsford Beaton, Dhawal Kulkarni, Sandeep Sharma, Gulbadin Naib, Ish Sodhi, Duanne Olivier, Michael Beer, Sachithra Senanayaka, Dawlat Zadran, Aaron Phangiso, Beuran Hendricks, Lakshan Sandakan, Aravind Sreenath, Barinder Singh Sran, Sean Abbott, Ben Wheeler, Kesrick Williams, Lungisani Ngidi, Ashoke Dinda, Praveen Kumar, Mujeeb Zadran, Pragyan Ojha, Jhye Richardson, Rahul Sharma, Joel Paris, Varun Aaron, Parvinder Awana, Munaf Patel, Scott Boland, Dushmantha Chameera, Shannon Gabriel, Akila Dhananjaya, Keshav Maharaj, Dane Paterson, Ben Hilfenhaus, Seth Rance, Fawad Ahmed, Tabrez Shamsi, Neil Wagner, Shapoor Zadran, Abhimanyu Mithun, Sheldon Cottrell, Matt Henry, Nuwan Kulasekara, Suranga Lakmal, Manpreet Gony, Pankaj Singh, Sudeep Tyagi
Wicket Keeper
Glenn Phillips, Denesh Ramdin, Niroshan Dickwella, Kusal Janith Perera, Nicolas Pooran, Alex Carey, Chadwick Walton, Tom Latham, M Shahzad Mohammadi, Shafiqullah Shafaq, Ambati Rayudu
All-Rounder
Gurkeerat Singh Mann, John Hastings, Sikandar Butt, Graeme Cremer, Rishi Dhawan, Solomon Mire, Ryan McLaren, Parveez Rasool, Shabbir Rahaman, Vernon Philander, Abul Raju, Paul Stirling, Malcolm Waller, Dilshan Munaweera, Thisara Perera, Pawan Negi, Seekkuge Prasanna, Ashton Agar, Mohammad Nabi, Rahmat Shah Zarmatai, Dwaine Pretorius, David Wiese, Asela Gunarathna, Dhananjaya Silva, Andile Phehlukwayo, Jonathan Carter, Rovman Powell, Mitchell Santner, Jayant Yadav, Irfan Pathan, Marlon Samuels, Andre Fletcher, Stuart Binny, Hilton Cartwright, Dasun Shanka, Dawid Malan, Farhaan Behardien, Jon-Jon Trevor Smuts, Ashley Nurse, Scott Kuggeleijn, Robbie Frylinck, Wiaan Mulder, Colin Munro, Vaughn Van Jaarsveld, Rayad Emrit, Mohammad Mahmudullah, Isuru Udana
==========
Base Price ₹40,00,000
Batsman
Tom Cooper
Bowler
Thomas Helm, Mitchell Swepson, Shahbaz Nadeem, T Natarajan
Wicket Keeper
Ishan Kishan
All-Rounder
Rajat Bhatia, Kevon Cooper, Vijay Shankar, Krunal Pandya, Deepak Hooda, Michael Neser, Jofra Archer
==========
Base Price ₹30,00,000
Batsman
Suryakumar Yadav, Christiaan Jonker, Vishnu Solanki, Alex Ross, Daniel Hughes
Bowler
Iqbal Abdullah, Siddarth Kaul, Anureet Singh, Pradeep Sangwan, Basil Thampi, Gurvinder Singh, Aniket Choudhary, Ankit Singh Rajpoot
Wicket Keeper
Ben McDermott
All-Rounder
Cameron Delport, Javon Searless, Roshon Primus
==========
Base Price ₹20,00,000
Batsman
Manprit Juneja, Mayank Siddana, Armaan Jaffer, Shivam Chauhan, Sachin Baby, Prithvi Shaw, Ankeet Bawane, Siddhesh Dinesh Lad, Apoorv Vijay Wankhade, Virat Singh, Marcus Harris, Ricky Bhui, Rassie Van der Dussen, Rajesh Bishnoi Sr, Paras Dogra, D.B Ravi Teja, Paul Valthaty, Amandeep Khare, Rinku Singh, Tanmay Agarwal, Ankit Lamba, Sarthak Ranjan, Priyank Panchal, Pratham Singh, Ishank Jaggi, Manjot Kalra, Anmolpreet Singh, Ruturaj Gaikwad, Sharad Lumba, Shubham Singh Rohilla, Himanshu Rana, Akshath Reddy, R Samarth, Mohammed Asaduddin, Abhinav Manohar, Rohan Marwaha, Rajat Patidar, Yash Sehrawat, Ravi Chauhan, Samit Gohil, Ramandeep Singh, Abhijeet Tomar, Jiwanjot Singh Chauhan, Abhimanyu Easwaran, Chirag Gandhi, Shubman Gill, Rahul Tripathi, Manan Vohra, Mayank Agarwal, Unmukt Chand
Bowler
Syed Khaleel Ahmed, Nidheesh M D Dinesan, Junior Dala, Karan Thakur, Anurag Verma, Lizaad Williams, Tanveer Ulhaq, Kushang Patel, Shelly Shaurya, A. Aswin Crist, Aaron Summers, Royston Dias, Kartik Tyagi, Tejas Singh Baroka, Abu Nechim Ahmed, Rahul Shukla, Bhargav Bhatt, Shadab Jakati, Sarabjit Ladda, Pravin Tambe, Ben Dwarshuis, Ajit Chahal, Deepak Chaudhary, Pradeep Dadhe, Domnic Joseph Muthuswamy, Babasafi Pathan, Monu Singh, Pradeep Thippeswamy, Kuldip Yadav, Krishnappa Gowtham, K.C. Cariappa, Mihir Hirwani, Akshay Wakhare, Manjeetkumar Chaudhary, Kulwant Khejroliya, Lukman Iqbal Meriwala, Navdeep Saini, Vikas Tokas, Yuvraj Chudasama, Rahul Chahar, Ronit More, Veer Pratap Singh, Varun Khanna, Pawan Suyal, Sandeep Warrier, J Suchith, Ashish Hooda, R. Sai Kishore, Rahil S Shah, Harmeet Singh, Ishwar Chaudhary, Parikshit Valsangkar, Avesh Khan, Amit Mishra, Cheepurupalli Stephen, Rajwinder Singh, Shubek Gill, Vinay Choudhary, Mayank Markande, Zahir Khan Pakteen, Ankit Soni, Lalit Yadav, Pardeep Sahu, Chama Milind, Umar Nazir Mir, Yarra Raj, Oshane Thomas, Athisayaraj V, Zeeshan Ansari, Siddharth Desai, Jiyas K, Alexandar Rama Doss, Nathu Singh, M. Ashwin, Shivil Kaushik, Baltej Dhanda, Armaan Jain, Mohsin Khan, Mukesh Kumar Singh, Arshdeep Singh, Rishi Arothe, Asif K M, Ravi Kiran Majeti, Ishan Porel, Aditya Thakare, Sandeep Lamichhane, Subodh Bhati, Mohan Prasath, Abhishek Sakuja, Javed Khan, Ashok Sandhu, Tushar Deshpande, Sayan Ghosh, Jaskaran Singh, Prasidh Krishna, Rajneesh Gurbani
Wicket Keeper
Ankush Bains, C.M. Gautam, Aditya Tare, N Jagadeesan, Nikhil Shankar Naik, Smit Patel, K.B Arun Karthik, Kona Srikar Bharat, Shreevats Goswami, Mahesh Rawat, Gitansh Khera, Jitesh Sharma, Vishnu Vinod, Sheldon Jackson, Kedar Devdhar, Prashant Chopra, Anuj Rawat, Harvik Desai, Anmol Malhotra, Dhruv Raval, Rohith Ravikumar, Mohammad Nazim Siddiqui, Mayank Sidhu, Sandeep Kumar Tomar, Sadiq Hassan Kirmani, Jaskaranvir Singh Sohi, Abhishek Gupta, Hamza Tariq, Rahul Yadav, Kyle Mayers
All-Rounder
Vyshak Vijay Kumar, Jaydev Shah, Shashank Singh, Manzoor Dar, Aman Khan, Diwesh Pathania, Shamss Mulani, Salman Nizar, Dafedar, Khizar Anwar, Mandeep Singh, Shubham Ranjane, Sidhant Dobal, Vinod Kumar C.V., Thomas Kaber, Midhun S, Akhil Arvind Herwadkar, Shamar Springer, Ashok Menaria, Jack Wildermuth, Odean Smith, Yogesh Nagar, Milind Kumar, Shubham Agrawal, Akshdeep Nath, Yomahesh Kumar, Vivek Singh, Mohammed Bilal, Arun Chaprana, Rajat Paliwal, Abhimanyu Rana, Sarang Rawat, Fabid, Farook Ahmed, Arjun Sharma, Shreyas Gopal, Akash Sudan, Sandeep Bavanaka, Karan Kaila, Aryaman Vikram Birla, Gaurav Gambir, Ankit Kaushik, Patrick Kruger, Sohraab Dhaliwal, Aditya Sarvate, Amish Sidhu, Shadley Van Schalkwyk, Vignesh Moorthy, Arjun Nair, Kanishk Seth, Shivam Dubey, Hanuma Vihari, Puneet Datey, Ninad Rathva, Siddhant Sharma, Mrinank Singh, Manan Sharma, Chintan Gaja, Amit Mishra, Jalaj Saxena, Bipul Sharma, Shreekant Wagh, Syed, Mehdi Hasan, Harshal Patel, Sumit Ruikar, Ashish Reddy, Kuldeep Hooda, Shaurya Sanandia, Vaibhav Rawal, Pankaj Jaswal, Anustup Majumdar, Dhruv Shorey, Kshitiz Sharma, Swapnil Singh, Himmat Singh, Writtick Chatterjee, Chris Green, Ryan Ninan, Rohan Prem, Rahul Tewatia, Puneed Datey, R. Sanjay Yadav, Imtiaz Ahmed, Atit Sheth, Dinesh Salunkhe, Pavan Deshpande, Shivam Sharma, Chaitanya Bishnoi, Indrajith Baba, Jatin Saxena, Shivam Mavi, Sagar Trivedi, Amit Verma, Akash Parkar, Nitish Rana, Anukul Roy, Akash Bhandari, Pratyush Singh, Ankit Sharma, Anirudha Ashok Joshi, Saurabh Kumar, Praveen Dubey, Kunal Chandela, Aamir Gani, Pulkit Narang, Riyan Parag, Karanveer Singh, Sumeet Verma, Cameron Gannon, Akshay Karnewar, Tajinder Dhillon, Govinda Poddar, Rajesh Sharma, Deepak Chahar, Antony Dhas, Kishore Pramod Kamath, Nikhil Gangta, Jay Gokul Bista, Sumanth Bodapati, Mahipal Lomror, Deepak Punia, Mayank Dagar, Kamlesh Nagarkoti, Darcy Short, Baba Aparajith, Abhishek Sharma, Milind Tandon.
Top 10 Cited Papers Software Engineering & Applications Research Articles From 2017 Issue
http://www.airccse.org/journal/ijsea/vol8.html
International Journal of Software Engineering & Applications (IJSEA)
ISSN: 0975-9018 (Online); 0976-2221 (Print)
http://www.airccse.org/journal/ijsea/ijsea.html
Citation Count – 04
Factors on Software Effort Estimation
Simon WU Iok Kuan
Faculty of Business Administration, University of Macao, Macau, China
ABSTRACT
Software effort estimation is an important process in the system development life cycle, as inaccurate estimates may affect the success of software projects. In the past few decades, various effort prediction models have been proposed by academics and practitioners. Traditional estimation techniques include Lines of Code (LOC), the Function Point Analysis (FPA) method and Mark II Function Points (Mark II FP), which have proven unsatisfactory for predicting the effort of all types of software. In this study, the author proposed a regression model to predict the effort required to design small and medium scale application software. To develop the model, the author used 60 completed software projects developed by a software company in Macau, extracted factors from those projects and applied them to a regression model. A prediction of software effort with an accuracy of MMRE = 8% was constructed.
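The accuracy figure in the abstract refers to MMRE, the Mean Magnitude of Relative Error: the average of |actual − estimated| / actual across all projects, with lower values meaning better predictions. A minimal sketch in Python; the sample effort values below are hypothetical illustrations, not drawn from the paper's 60-project dataset:

```python
def mmre(actual, predicted):
    """Mean Magnitude of Relative Error: mean of |actual - predicted| / actual."""
    if len(actual) != len(predicted) or not actual:
        raise ValueError("need two equal-length, non-empty sequences")
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical efforts in person-hours (actual vs. model-predicted).
actual = [120.0, 80.0, 200.0]
predicted = [110.0, 88.0, 190.0]
print(round(mmre(actual, predicted), 4))  # → 0.0778, i.e. roughly 8% average error
```

An MMRE of 0.08, as reported in the paper, means the model's estimates were off by about 8% of the true effort on average.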
KEYWORDS
Effort Estimation, Software Projects, Software Applications, System Development Life Cycle.
For More Details : http://aircconline.com/ijsea/V8N1/8117ijsea03.pdf
Volume Link:
http://www.airccse.org/journal/ijsea/vol8.html
REFERENCES
[1] Fu, Ya-fang, Liu, Xiao-dong, Yang, Ren-nong, Du, Yi-lin and Li Yan-jie (2010), “A Software Size Estimation Method Based on Improved FPA”, Second World Congress on Software Engineering,Vol. 2, pp228-233.
[2] Hastings, T. E. & Sajeev, A. S. M. (2001), “A Vector-Based Approach to Software Size Measurement and Effort Estimation”, IEEE Transactions on Software Engineering, Vol. 27, No. 4, pp.337-350.
[3] Norris, K. P. (1971), “The Accuracy of Project Cost and Duration Estimates in Industrial R&D”, R&D Management, Vol. 2, No. 1, pp.25-36.
[4] Murmann, Philipp A. (1994), “Expected Development Time Reductions in the German Mechanical Engineering Industry”, Journal of Product innovation Management, Vol. 11, pp.236-252.
[5] David Consulting Group (2012), “Project Estimating”, DCG Corporate Office, Paoli, 2007: http://davidconsultinggroup.com/training/estimation.aspx (January, 2017)
[6] Boehm, Barry (1976), “Software Engineering”, IEEE Transactions on Computers, Vol. C-25, Issue 12, pp1226-1241.
[7] Dreger, J. B. (1989), “Function Point Analysis”, Englewood Cliffs, NJ:Prentice-Hall.
[8] Smith, Randy K., Hale, Joanne E. & Parrish, Allen S. (2001), “An Empirical Study Using Task Assignment Patterns to Improve the Accuracy of Software Effort Estimation”, IEEE Transactions on Software Engineering, Vol. 27, No. 3, pp.264- 271.
[9] Sataphthy, Shashank Mouli, Kumar, Mukesh & Rath, Santanu Kumar (2013), “Class Point Approach for Software Effort Estimation Using Soft Computing Techniques”, International Conference on Advances in Computing, Communications and Informatics (ICACCI), pp178-183.
[10] Tariq, Sidra, Usman, Muhammad, Wong, Raymond, Zhuang, Yan & Fong, Simon (2015), “On Learning Software Effort Estimation”, 3rd International Symposium and Business Intelligence, pp79-84.
[11] Bhandari, Sangeeta (2016), “FCM Based Conceptual Framework for Software Effort Estimation”, International Conference on Computing for Sustainable Global Development, pp2585-2588.
[12] Moharreri, Kayhan, Sapre, Alhad Vinayak, Ramanathan, Jayashree & Ramnath, Rajiv (2016), “CostEffective Supervised Learning Models for Software Effort Estimation in Agile Environments”, IEEE 40th Annual Computer Software and Applications Conference, p135-140.
[13] Mukhopadhyay, Tridas & Kekre, Sunder. (1992), “Software Effort Models for Early Estimation of Process Control Applications”, IEEE Transactions on Software Engineering, Vol. 18, No. 10, pp.915- 924.
[14] Boehm, Barry W. (1995), “Cost Models for Future Software Life Cycle Processes: COCOMO 2.0”, Annals of Software Engineering Special Volume on Software Process and Product Measurement, Science Publisher, Amsterdam, Netherlands, 1(3), pp45-60.
[15] Srinivasan, Krishnamoorthy & Fisher, Douglas (1995), “Machine Learning Approaches to Estimating Software Development Effort”, IEEE Transactions on Software Engineering, Vol. 21, No. 2, pp126- 137.
[16] Strike, Kevin, Emam, Khaled EI & Madhavji, Nazim (2001), “Software Cost Estimation with Incomplete Data”, IEEE Transactions on Software Engineering, Vol. 27, No. 10, pp215-223.
[17] Putnam, Lawrence H. (1978), “A General Empirical Solution to the Macro Software Sizing and Estimating Problem”, IEEE Transactions on Software Engineering, Vol. SE-4, No. 4, pp345-361.
[18] Boehm, Barry W. (1981), “Software Engineering Economics”, Englewood Cliffs, NJ:Prentice-Hall.
[19] Subramanian, Girish H. & Breslawski, Steven (1995), “An Empirical Analysis of Software Effort Estimate Alternations”, Journal of Systems Software, Vol. 31, pp135-141.
[20] Boehm, Barry W. (1984), “Software Engineering Economics”, IEEE Transactions on Software Engineering, Vol. 10, pp4-21.
[21] Agrawal, Priya & Kumar, Shraddha (2016), “Early Phase Software Effort Estimation Model”, Symposium on Colossal Data Analysis and Networking, pp1-8.
[22] Albrecht, Allen. J. (1979), “Measuring Application Development Productivity”, Proceedings of the IBM Applications Development Symposium, pp83-92.
[23] Albrecht, Allen J. & Gaffney. John E. (1983), “Software Function, Source Lines of Code, and Development Effort Prediction: A Software Science Validation”, IEEE Transactions on Software Engineering, Vol. 9, No.6, pp639-648.
[24] Hu, Qing, Plant, Robert & Hertz, David (1998), “Software Cost Estimation Using Economic Production Models”, Journal of Management Information Systems, Vol. 15, No. 1, pp143-163.
[25] Bock, D. B. & Klepper, R. (1992), “FP/S: A Simplified Function Point Counting Method”, The Journal of Systems Software, 18:245-254.
[26] Kemerer, Chris F. (1993), “Reliability of Function Points Measurement: A Field Experiment”, Communications of the ACM, 36(2):85-97.
[27] Lokan, Chris J. (2000). “An Empirical Analysis of Function Point Adjustment Factors”, Journal of Information and Software Technology, vol. 42, pp649-660.
[28] Jeffery, J., Low, G. & Barnes, C. (1993), “Comparison of Function Point Counting Techniques”, IEEE Transactions on Software Engineering, Vol. 19, No. 5, pp529- 532.
[29] Misra, A. K. & Chaudhary, B. D. (1991), “An Interactive Structured Program Development Tool”, IEEE Region 10 International Conference on EC3-Energy, Computer, Communication and Control Systems, 3, 1-5.
[30] Kendall, K. E. & Kendall, J. E. (2005), “System Analysis and Design”, 6/e, Prentice- Hall.
[31] Brooks, F. (1975), “The Mythical Man-Month.” Addison-Wesley.
[32] Zhang, Xiaoni & Windsor, John (2003). “An Empirical Analysis of Software Volatility and Related Factors”, Industrial Management & Data Systems, Vol. 103, No. 4, pp275-281.
[33] Kemerer, C. F. & Slaughter, S. (1997). “Determinants of Software Maintenance Profiles: An Empirical Investigation”, Journal of Software Maintenance, Vol. 9, pp235-251.
[34] Krishnan, Mayuram S. (1998). “The Role of Team Factors in Software Cost and Quality”, Information Technology & People, Vol. 11(1), pp20-35.
[35] MacDonell, S. G., Shepperd, M. J. & Sallis, P. (1997), “Metrics for Database Systems: An Empirical Study”, Proceedings of the 4th International Software Metrics Symposium(Metrics 1997).
[36] MacDonell, S. G. (1994). “Comparative Review of Functional Complexity Assessment Methods for Effort Estimation”, Software Engineering Journal, pp107- 116.
Citation Count – 03
A Brief Program Robustness Survey
Ayman M. Abdalla, Mohammad M. Abdallah and Mosa I. Salah
Faculty of Science and I.T, Al-Zaytoonah University of Jordan,
Amman, Jordan
ABSTRACT
Program robustness is now more important than ever because of the role software programs play in our lives. Many papers have defined it, measured it, and put it into context. In this paper, we explore the different definitions of program robustness and the different types of techniques used to achieve or measure it. From the many papers on robustness, we chose those that clearly discuss program or software robustness. These papers state that program (or software) robustness indicates the absence of ungraceful failures. There are different types of techniques used to create or measure a robust program, but there is still wide space for research in this area.
Keywords:
Robustness, Robustness measurement, Dependability, Correctness.
For More Details: http://aircconline.com/ijsea/V8N1/8117ijsea01.pdf
Volume Link: http://www.airccse.org/journal/ijsea/vol8.html
REFERENCES
[1] IEEE Standard Glossary of Software Engineering Terminology, 1990.
[2] J. C. Laprie, J. Arlat, C. Beounes, and K. Kanoun, "Definition and analysis of hardware- and software-fault-tolerant architectures," Computer, vol. 23, pp. 39-51, 1990.
[3] A. Avizienis, J. C. Laprie, B. Randell, and C. Landwehr, "Basic concepts and taxonomy of dependable and secure computing," Dependable and Secure Computing, IEEE Transactions on, vol. 1, pp. 11-33, 2004.
[4] W. S. Jawadekar, Software Engineering: Principles and Practice: Mcgraw Hill Higher Education, 2004.
[5] J. C. Laprie, "Dependable computing: concepts, challenges, directions," in Computer Software and Applications Conference, 2004. COMPSAC 2004. Proceedings of the 28th Annual International, 2004, p. 242, vol. 1.
[6] R. S. Pressman, Software Engineering: A Practitioner's Approach, Seventh edition ed.: McGraw Hill Higher Education, 2009.
[7] I. Sommerville, Software Engineering: Addison-Wesley, 2006.
[8] D. M. John, I. Anthony, and O. Kazuhira, Software reliability: measurement, prediction, application: McGraw-Hill, Inc., 1987.
[9] L. L. Pullum, Software fault tolerance techniques and implementation: Artech House, Inc., 2001.
[10] D. G. Steven, "Robustness in Complex Systems," presented at the Proceedings of the Eighth Workshop on Hot Topics in Operating Systems, 2001.
[11] G. M. Weinberg (1983), "Kill That Code!", Infosystems, pp. 48-49.
[12] D. John and Philip J. Koopman, Jr., "Robust Software - No More Excuses," presented at the Proceedings of the 2002 International Conference on Dependable Systems and Networks, 2002.
[13] D. Frank, Z. Nickolai, K. Frans, M. David, and M. Robert, "Event-driven programming for robust software," presented at the Proceedings of the 10th workshop on ACM SIGOPS European workshop, Saint-Emilion, France, 2002.
[14] Y. Bi, J. Yuan, and Y. Jin, "Beyond the Interconnections: Split Manufacturing in RF Designs," Electronics, vol. 4, p. 541, 2015.
[15] Y. Bi, X. S. Hu, Y. Jin, M. Niemier, K. Shamsi, and X. Yin, "Enhancing Hardware Security with Emerging Transistor Technologies," presented at the Proceedings of the 26th edition on Great Lakes Symposium on VLSI, Boston, Massachusetts, USA, 2016.
[16] Y. Bi, K. Shamsi, J.-S. Yuan, P.-E. Gaillardon, G. D. Micheli, X. Yin, X. S. Hu, M. Niemier, Y. Jin, "Emerging Technology-Based Design of Primitives for Hardware Security," J. Emerg. Technol. Comput. Syst., vol. 13, pp. 1-19, 2016.
[17] M. Rebaudengo, M. S. Reorda, M. Torchiano, and M. Violante, "Soft-Error Detection through Software Fault-Tolerance Techniques," in IEEE International Symposium on Defect and Fault-Tolerance in VLSI Systems, 1999.
[18] R. L. Michael, H. Zubin, K. S. S. Sam, and C. Xia, "An Empirical Study on Testing and Fault Tolerance for Software Reliability Engineering," presented at the Proceedings of the 14th International Symposium on Software Reliability Engineering, 2003.
[19] N. H. Michael and T. H. Vance, "Robust Software," IEEE Internet Computing, vol. 6, pp. 80-82, 2002.
[20] M. Dix and H. D. Hofmann, "Automated software robustness testing - static and adaptive test case design methods," in Euromicro Conference, 2002. Proceedings. 28th, 2002, pp. 62-66.
[21] N. H. Michael, T. H. Vance, and G. Rosa Laura Zavala, "Robust software via agent-based redundancy," presented at the Proceedings of the second international joint conference on Autonomous agents and multiagent systems, Melbourne, Australia, 2003.
[22] T. Rajesh and N. H. Michael, "Multiagent Reputation Management to Achieve Robust Software Using Redundancy," presented at the Proceedings of the IEEE/WIC/ACM International Conference on Intelligent Agent Technology, 2005.
[23] V. T. Holderfield and M. N. Huhns, "A Foundational Analysis of Software Robustness Using Redundant Agent Collaboration," in Agent Technologies, Infrastructures, Tools, and Applications for E-Services, vol. 2592/2003, Berlin/Heidelberg: Springer, 2003, pp. 355-369.
[24] R. Laddaga (1999, May/June), "Creating Robust Software through Self-Adaptation," IEEE Intelligent Systems, pp. 26-30.
[25] M. K. Mieczyslaw, B. Kenneth, and A. E. Yonet, "Control Theory-Based Foundations of Self-Controlling Software," vol. 14, ed: IEEE Educational Activities Department, 1999, pp. 37-45.
[26] C. Petitpierre and A. Eliëns, "Active Objects Provide Robust Event-Driven Applications," in SERP'02, Las Vegas, 2002, pp. 253-259.
[27] G. C. Philip, "Software design guidelines for event-driven programming," Journal of Systems and Software, vol. 41, pp. 79-91, 1998.
[28] B. P. Miller, D. Koski, C. P. Lee, V. Maganty, R. Murthy, A. Natarajan, J. Steidl, "Fuzz Revisited: A Re-examination of the Reliability of UNIX Utilities and Services," Report: University of Wisconsin, 1995.
[29] M. Schmid and F. Hill, "Data Generation Techniques for Automated Software Robustness Testing," in Proceedings of the International Conference on Testing Computer Software, 1999, pp. 14-18.
[30] J. P. DeVale, P. J. Koopman, and D. J. Guttendorf, "The Ballista Software Robustness Testing Service," presented at the Testing Computer Software Conference, 1999.
[31] P. Koopman. (2002, 2nd September). The Ballista Project: COTS Software Robustness Testing. Available: http://www.ece.cmu.edu/~koopman/ballista/index.html
[32] K. Kanoun, H. Madeira, and J. Arlat, "A Framework for Dependability Benchmarking," presented at the The International Conference on Dependable Systems and Networks, Washington, D.C., USA, 2002.
[33] A. B. Brown and P. Shum, "Measuring Resiliency of IT Systems," presented at the SIGDeB Workshop, 2005.
[34] A. B. Brown, J. Hellerstein, M. Hogstrom, T. Lau, S. Lightstone, P. Shum, M. Peterson, "Benchmarking Autonomic Capabilities: Promises and Pitfalls," in International Conference on Autonomic Computing (ICAC'04), Los Alamitos, CA, USA, 2004, pp. 266-267.
[35] H. Zuse, A Framework of Software Measurement: Walter de Gruyter, 1998.
[36] N. E. Fenton and S. L. Pfleeger, Software Metrics, A Rigorous and Practical Approach, 2 ed.: PWS Publishing Company, 1997.
[37] ISO/IEC 15939: Systems and software engineering -- Measurement process, ISO/IEC, 2007.
[38] K. Kaur, K. Minhas, N. Mehan, and N. Kakkar, "Static and Dynamic Complexity Analysis of Software Metrics," Empirical Software Engineering, vol. 56, pp. 159-161, 2009.
[39] D. M. Jones, The New C Standard: A Cultural and Economic Commentary, 1st edition ed.: Addison-Wesley Professional, 2003.
[40] International Standard ISO/IEC 9899, 1999.
[41] D. M. Jones, The New C Standard: An Economic and Cultural Commentary, 2002.
[42] C programming language coding guidelines, www.lrdev.com, 1998.
[43] M. Arup and P. S. Daniel, "Measuring Software Dependability by Robustness Benchmarking," vol. 23, ed: IEEE Press, 1997, pp. 366-378.
[44] B. Eslamnour and S. Ali, "Measuring robustness of computing systems," Simulation Modelling Practice and Theory, vol. 17, pp. 1457-1467, 2009.
[45] H. Arne, R. Razvan, and E. Rolf, "Methods for multi-dimensional robustness optimization in complex embedded systems," presented at the Proceedings of the 7th ACM & IEEE international conference on Embedded software, Salzburg, Austria, 2007.
[46] M. Abdallah, M. Munro, and K. Gallagher, "Certifying software robustness using program slicing," in 2010 IEEE International Conference on Software Maintenance, Timisoara, Romania, 2010, pp. 1-2.
Citation Count – 02
Culture Effect on Requirements Elicitation Practice in
Developing Countries
Ayman Sadig1 and Abd-El-Kader Sahraoui2
1Ahfad University for Women and SUST, Khartoum, Sudan
2LAAS-CNRS, Université de Toulouse, CNRS, U2J, Toulouse, France
ABSTRACT
Requirement elicitation is a very important step in developing any new application. This paper examines the effect of culture on requirement elicitation in developing countries.
This unique study looks at the requirement elicitation process in 10 different parts of the world, including the Arab world, India, China, Africa and South America. The focus is on how culture affects requirement elicitation (RE) and gives every place its own RE practice. The data were collected through surveys and direct interviews. The results show a striking cultural effect on RE.
The conclusion is that culture deeply affects which technique is chosen for requirement elicitation. RE in Thailand is very different from RE in the Arab world. For example, in Thailand respect for the leader is critical, and any questioning of a manager's methods creates a problem, while in the Arab world the decision tree is a favourite RE technique because visuals are liked much more than documents.
KEYWORDS
Culture impact, requirement elicitation.
For More Details: http://aircconline.com/ijsea/V8N1/8117ijsea05.pdf
Volume Link: http://www.airccse.org/journal/ijsea/vol8.html
REFERENCES
[1] Lee, S.hyun. & Kim Mi Na, (2008) “This is my paper”, ABC Transactions on ECE, Vol. 10, No. 5, pp120-122.
[2] Gizem, Aksahya & Ayese, Ozcan (2009) Communications & Networks, Network Books, ABC Publishers.
[3] Sadiq, M. and Mohd, S. (2009), “Elicitation and Prioritization of Software Requirements”, International Journal of Recent Trends in Engineering, Vol. 2, No. 3, pp. 138-142.
[4] Bergey, John, et al. Why Reengineering Projects Fail. No. CMU/SEI-99-TR-010. Carnegie Mellon University, Pittsburgh, PA: Software Engineering Institute, 1999.
[5] Goguen, J. A., Linde, C. (1993): Techniques for Requirements Elicitation, International Symposium on Requirements Engineering, pp. 152-164, January 4-6, San Diego, CA.
[6] Robertson, S., Robertson, J. (1999) Mastering the Requirements Process, Addison Wesley: Great Britain.
[7] Iqbal, Tabbassum, and Mohammad Suaib. "Requirement Elicitation Technique: A Review Paper." Int. J. Comput. Math. Sci 3.9 (2014).
[8] HOFSTEDE G (1980) Culture’s Consequences: International Differences in Work- Related Values. Sage, Newbury Park, CA.
[9] SCHEIN EH (1985) Organisational Culture and Leadership. Jossey-Bass, San Francisco, CA.
[10] LYTLE AL, BRETT JM, BARSNESS ZI, TINSLEY CH and JANSSENS M (1999) A paradigm for confirmatory cross-cultural research in organizational behavior. Research in Organizational Behavior 17, 167–214, https:// lirias.kuleuven.be/handle/123456789/31199.
[11] Kluckhohn, K (1954) Culture and behavior. In G. Lindsey (ED.) handbook of social psychology
[12] TRIANDIS HC (1995) Individualism & Collectivism. Westview Press, Boulder, CO.
[13] ROKEACH M (1973) The Nature of Human Values. Free Press, New York.
[14] KARAHANNA E, EVARISTO JR and SRITE M (2005) Levels of culture and individual behavior: an integrative perspective. Journal of Global Information Management 13(2), 1–20
[15] Fernández, Daniel Méndez, and Stefan Wagner. "Naming the pain in requirements engineering: A design for a global family of surveys and first results from Germany." Information and Software Technology 57 (2015): 616-643.
[16] Davis, Gordon B. "Strategies for information requirements determination." IBM Systems Journal 21.1 (1982): 4-30.
[17] Rouibah, Kamel. "Social usage of instant messaging by individuals outside the workplace in Kuwait: A structural equation model." Information Technology & People 21.1 (2008): 34-68.
[18] Byrd, Terry Anthony, Kathy L. Cossick, and Robert W. Zmud. "A synthesis of research on requirements analysis and knowledge acquisition techniques." MIS Quarterly (1992): 117-138.
[19] Arnott, David, Waraporn Jirachiefpattana, and Peter O'Donnell. "Executive information systems development in an emerging economy." Decision Support Systems 42.4 (2007): 2078-2084.
[20] Kontio, Jyrki, Laura Lehtola, and Johanna Bragge. "Using the focus group method in software engineering: obtaining practitioner and user experiences." Empirical Software Engineering, 2004. ISESE'04. Proceedings. 2004 International Symposium on. IEEE, 2004.
[21] Agarwal, Ritu, Atish P. Sinha, and Mohan Tanniru. "The role of prior experience and task characteristics in object-oriented modeling: an empirical study." International journal of human-computer studies 45.6 (1996): 639-667.
[22] Liu, Lin, et al. "Understanding chinese characteristics of requirements engineering." 2009 17th IEEE International Requirements Engineering Conference. IEEE, 2009.
[23] Rouibah, Kamel, and Sulaiman Al-Rafee. "Requirement engineering elicitation methods: A Kuwaiti empirical study about familiarity, usage and perceived value." Information management & computer security 17.3 (2009): 192-217.
[24] Liu, Lin, et al. "Understanding chinese characteristics of requirements engineering." 2009 17th IEEE International Requirements Engineering Conference. IEEE, 2009.
[25] Fernández, Daniel Méndez, and Stefan Wagner. "Naming the pain in requirements engineering: A design for a global family of surveys and first results from Germany." Information and Software Technology 57 (2015): 616-643.
[26] Winschiers-Theophilus, Heike, et al. "Determining requirements within an indigenous knowledge system of African rural communities." Proceedings of the 2010 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists. ACM, 2010.
[27] Mursu, Anja, et al. "Information systems development in a developing country: Theoretical analysis of special requirements in Nigeria and Africa." System Sciences, 2000. Proceedings of the 33rd Annual Hawaii International Conference on. IEEE, 2000.
[28] Anwar, Fares, and Rozilawati Razali. "A practical guide to requirements elicitation techniques selection-An empirical study." Middle-East Journal of Scientific Research 11.8 (2012): 1059-1067.
[29] Helliwell, John, Richard Layard, and Jeffrey Sachs. World Happiness Report 2013. United Nations, 2013.
[30] Hofstede, Geert, and Gert Jan Hofstede. Cultures and Organisations: Software of the Mind. McGraw-Hill, New York, 2005.
[31] Thanasankit, Theerasak, and Brian Corbitt. "Cultural context and its impact on requirements elicitation in Thailand." EJISDC: The Electronic Journal on Information Systems in Developing Countries 1 (2000): 2.
[32] Komin, S. (1990). Psychology of the Thai People: Values and Behavioral Patterns. Bangkok, Thailand: NIDA (National Institute of Development Administration).
[33] Khan, Shadab, Aruna B. Dulloo, and Meghna Verma. "Systematic review of requirement elicitation techniques." (2014).
[34] Sadig, Ayman. "Requirements Engineering Practice in Developing Countries: Elicitation and Traceability Processes." Proceedings of the International Conference on Software Engineering Research and Practice (SERP). The Steering Committee of The World Congress in Computer Science, Computer Engineering and Applied Computing (WorldComp), 2016.
[35] Wiegers, Karl, and Joy Beatty. Software requirements. Pearson Education, 2013
Citation Count – 02
A User Story Quality Measurement Model for Reducing Agile
Software Development Risk
Sen-Tarng Lai
Department of Information Technology and Management, Shih Chien University, Taipei, Taiwan
ABSTRACT
In the mobile communications age, the IT environment and IT technologies change rapidly. Requirements change is a challenge every software project must face; if its impact can be overcome, software development risk can be effectively reduced. Agile software development uses the Iterative and Incremental Development (IID) process and focuses on workable software and client communication, which makes it well suited to handling requirements change during the development process. In agile development, user stories are the key documents for client communication and the criteria for acceptance testing. However, agile development pays little attention to formal requirements analysis and artifact traceability, which creates potential risks in software change management. This paper analyzes and collects the critical quality factors of user stories and proposes the User Story Quality Measurement (USQM) model. By applying the USQM model, the requirements quality of agile development can be enhanced and the risks of requirements change reduced.
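Some of the user story quality factors a model like USQM draws on (e.g. the INVEST criteria the paper's references discuss) can be checked mechanically. The sketch below is purely illustrative and not taken from the paper; the function name, the particular checks, and the word-count threshold are hypothetical choices:

```python
# Illustrative sketch: a few mechanical checks inspired by the INVEST
# criteria for user stories (Independent, Negotiable, Valuable,
# Estimable, Small, Testable). Checks and thresholds are hypothetical,
# not the USQM model itself.

def score_user_story(story: str) -> dict:
    """Return pass/fail results for a handful of surface-level checks."""
    text = story.lower()
    checks = {
        # "Small": very long stories are hard to estimate and test.
        "small": len(story.split()) <= 50,
        # "Valuable": the canonical template names a role and a benefit.
        "valuable": "as a" in text and "so that" in text,
        # "Testable": the story should state an observable capability.
        "testable": "i want" in text or "i can" in text,
    }
    checks["score"] = sum(1 for k in ("small", "valuable", "testable")
                          if checks[k])
    return checks

story = ("As a registered user, I want to reset my password by email "
         "so that I can regain access to my account.")
result = score_user_story(story)
```

A real measurement model would combine such indicators with reviewer judgment; surface checks like these only catch stories that are malformed on their face.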
KEYWORDS
Agile development, user story, software project, quality measurement, USQM.
For More Details : http://aircconline.com/ijsea/V8N2/8217ijsea05.pdf
Volume Link : http://www.airccse.org/journal/ijsea/vol8.html
REFERENCES
[1] S. A. Bohner and R. S. Arnold, 1996. Software Change Impact Analysis, IEEE Computer Society Press, CA, pp. 1-26.
[2] S. A. Bohner, 2002. Software Change Impacts: An Evolving Perspective, Proc. of IEEE Intl Conf. on Software Maintenance, pp. 263-271.
[3] B. W. Boehm, 1991. Software risk management: Principles and practices, IEEE Software, 8(1), 1991, pp. 32-41.
[4] A. Cockburn, 2002. Agile Software Development, Addison-Wesley.
[5] M. Cohn and D. Ford, 2003. Introducing an Agile Process to an Organization, IEEE Computer, vol. 36 no. 6 pp. 74-78, June 2003.
[6] V. Szalvay, “An Introduction to Agile Software Development,” Danube Technologies Inc., 2004.
[7] C. Larman and V. R. Basili, 2003. Iterative and Incremental Development: A Brief History, IEEE Computer, June 2003.
[8] C. Larman, 2004. Agile and Iterative Development: A Manager's Guide, Boston: Addison Wesley.
[9] S. R. Schach, 2010. Object-Oriented Software Engineering, McGraw-Hill Companies.
[10] J. L. Eveleens and C. Verhoef, 2010. The Rise and Fall of the Chaos Report Figures, IEEE Software, vol. 27, no. 1, pp. 30-36.
[11] The Standish group, 2009. “New Standish Group report shows more project failing and less successful projects,” April 23, 2009.
(http://www.standishgroup.com/newsroom/chaos_2009.php)
[12] B. W. Boehm, 1989. “Tutorial: Software Risk Management,” IEEE CS Press, Los Alamitos, Calif.
[13] R. Fairley,1994. “Risk management for Software Projects,” IEEE Software, vol. 11, no. 3, pp. 57-67.
[14] R. S. Pressman, 2010. Software Engineering: A Practitioner's Approach, McGraw-Hill, New York, 2010.
[15] David S. Frankel, 2003. Model Driven Architecture: Applying MDA to Enterprise Computing, John Wiley & Sons.
[16] Mike Cohn, 2004. User Stories Applied: For Agile Software Development, Addison-Wesley Professional; 1st edition.
[17] Ron Jeffries, 2001. “Essential XP: Card, Conversation, Confirmation,” Posted on: August 30, 2001. (http://xprogramming.com/index.php)
[18] Bill Wake, 2003. “INVEST in Good Stories, and SMART Tasks,” Posted on August 17, 2003, (http://xp123.com/articles/invest-in-good-stories-and-smart-tasks/)
[19] Bill Wake, 2012. “Independent Stories in the INVEST Model,” Posted on: February 8, 2012, (http://xp123.com/articles/independent-stories-in-the-invest-model/)
[20] T. J. McCabe, 1976. A Complexity Measure, IEEE Trans. On Software Eng., Vol. 2, No 4, pp.308-320.
[21] M. H. Halstead, 1977, Elements of Software Science, North-Holland, New York.
[22] Ivar Jacobson and Pan-Wei Ng, 2004, Aspect-Oriented Software Development with Use Cases, Addison-Wesley Boston, 2004.
[23] Ralph Young, 2001, Effective Requirements Practices, Addison-Wesley, Boston, 2001.
[24] S. D. Conte, H. E. Dunsmore and V. Y. Shen, 1986. Software Engineering Metrics and Models, Benjamin/Cummings, Menlo Park.
[25] N. E. Fenton, 1991, Software Metrics - A Rigorous Approach, Chapman & Hall.
[26] D. Galin, 2004. Software Quality Assurance – From theory to implementation, Pearson Education Limited, England.
Citation Count – 19
A Survey of Verification Tools Based on Hoare Logic
Nahid A. Ali
College of Computer Science & Information Technology, Sudan University of Science & Technology, Khartoum, Sudan
ABSTRACT
The quality and correctness of software are of great concern in computer systems. Formal verification tools can be used to provide confidence that a software design is free from certain errors. This paper surveys tools that perform automatic software verification to detect programming errors or prove their absence. The two tools considered are both based on Hoare logic, namely KeY-Hoare and the Hoare Advanced Homework Assistant (HAHA). A detailed example using these tools is provided, underlining their differences when applied to practical problems.
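A Hoare triple {P} S {Q} asserts that if precondition P holds before statement S executes, postcondition Q holds afterwards. Tools such as KeY-Hoare and HAHA discharge these proof obligations symbolically; as a didactic stand-in only (not how those tools work), a triple over a small integer domain can be checked by exhaustive testing:

```python
# Toy illustration of a Hoare triple {P} S {Q}: exhaustively check that
# every state satisfying the precondition ends in a state satisfying
# the postcondition. Real verifiers (KeY-Hoare, HAHA) reason
# symbolically over all states; this brute-force check is a sketch.

def holds(pre, prog, post, states):
    """Check {pre} prog {post} over an explicit finite state space."""
    return all(post(prog(s)) for s in states if pre(s))

# Example: {x >= 0}  x := x + 1  {x >= 1}
pre  = lambda x: x >= 0
prog = lambda x: x + 1
post = lambda x: x >= 1

valid = holds(pre, prog, post, range(-100, 100))
```

Dropping the precondition (taking P to be true) makes the same triple fail, since negative inputs violate the postcondition; that is exactly the role the precondition plays in the logic.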
KEYWORDS
Hoare Logic, Software Verification, Formal Verification Tools, KeY-Hoare Tool, Hoare Advanced Homework Assistant Tool
For More Details : http://aircconline.com/ijsea/V8N2/8217ijsea06.pdf
Volume Link : http://www.airccse.org/journal/ijsea/vol8.html
REFERENCES
[1] D'silva, Vijay and Kroening, Daniel and Weissenbacher, Georg, "A survey of automated techniques for formal software verification." IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 27(7), pp.1165- 1178, 2008.
[2] C. A. R. Hoare, "An Axiomatic Basis for Computer Programming," Communications of the ACM, vol. 12, no. 10, pp. 576 - 580, 1969.
[3] R. W. Floyd, "Assigning Meanings to Programs," Mathematical Aspects of Computer Science, vol. 19, no. 1, pp. 19-32, 1967.
[4] Mili, Ali, and Fairouz Tchier, Software Testing: Concepts and Operations, Hoboken, New Jersey: John Wiley & Sons, 2015.
[5] "Isabelle," [Online]. Available: http://www.cl.cam.ac.uk/research/hvg/Isabelle/.
[6] S. Owre, J. Rushby and N. Shankar, "PVS: A Prototype Verification System," in 11th International Conference on Automated Deduction (CADE), vol. 607, Springer-Verlag, 1992, pp. 748-752.
[7] "Symbolic Model Verifier," [Online]. Available: http://www.cs.cmu.edu/~modelcheck/smv.html.
[8] J. Winkler, "The Frege Program Prover FPP," in Internationales Wissenschaftliches Kolloquium, vol. 42, 1997, pp. 116-121.
[9] D. Crocker, "Perfect Developer: A Tool for Object-Oriented Formal Specification and Refinement," Tools Exhibition Notes at Formal Methods Europe, 2003.
[10] Hähnle, Reiner, and Richard Bubel, "A Hoare-Style Calculus with Explicit State Updates," Formal Methods in Computer Science Education (FORMED), pp. 49-60, 2008.
[11] "Hoare Advanced Homework Assistant (HAHA)," [Online]. Available: http://haha.mimuw.edu.pl/.
[12] T. Sznuk and A. Schubert, "Tool Support for Teaching Hoare Logic," in Software Engineering and Formal Methods, Springer, 2014, pp. 332-346.
[13] "KeY-Hoare System," [Online]. Available: http://www.key-project.org/download/hoare/.
[14] L. de Moura and N. Bjørner, "Z3: An efficient SMT solver," in Tools and Algorithms for the Construction and Analysis of Systems, Springer, 2008, pp. 337-340.
[15] C. Barrett, C. L. Conway, M. Deters, L. Hadarean, D. Jovanovi´c, T. King, A. Reynolds and C. Tinelli, "CVC4," in Computer Aided Verification, Springer, 2011, pp. 171-177.
[16] Feinerer, Ingo and Salzer, Gernot , A comparison of tools for teaching formal software verification, Formal Aspects of Computing, vol. 21(3), pp. 293–301, 2009.
Citation Count – 18
The Impact of Software Complexity on Cost and Quality - A Comparative Analysis Between Open Source and Proprietary Software
Anh Nguyen-Duc
IDI, NTNU, Norway
ABSTRACT
Early prediction of software quality is important for better software planning and control. In early development phases, design complexity metrics are considered useful indicators of software testing effort and of some quality attributes. Although many studies investigate the relationship between design complexity and cost and quality, it is unclear what we have learned beyond the scope of individual studies. This paper presents a systematic review of the influence of software complexity metrics on quality attributes. We aggregated Spearman correlation coefficients from 59 different data sets drawn from 57 primary studies using a tailored meta-analysis approach. We found that fault proneness and maintainability are the most frequently investigated attributes. The Chidamber & Kemerer metric suite is the most frequently used, but not all of its metrics are good indicators of quality attributes. Moreover, the impact of these metrics does not differ between proprietary and open source projects. The results provide some implications for building quality models across project types.
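The standard way to pool correlation coefficients across data sets, as in the meta-analysis texts the paper cites (Hedges & Olkin), is the Fisher z-transformation: each r is mapped to z = atanh(r), the z values are averaged with inverse-variance weights n − 3, and the weighted mean is transformed back with tanh. A minimal sketch of the fixed-effect version; the (r, n) pairs below are invented for illustration, not data from the review:

```python
import math

# Minimal sketch of fixed-effect meta-analysis of correlation
# coefficients via the Fisher z-transformation (cf. Hedges & Olkin).
# The (r, n) study pairs are invented for illustration.

def pooled_correlation(studies):
    """studies: iterable of (r, n) pairs; returns the pooled r."""
    num = den = 0.0
    for r, n in studies:
        w = n - 3                  # inverse-variance weight for Fisher z
        num += w * math.atanh(r)   # Fisher z-transform of each r
        den += w
    return math.tanh(num / den)    # back-transform the weighted mean z

studies = [(0.45, 120), (0.30, 80), (0.52, 200)]
r_pooled = pooled_correlation(studies)
```

Because the weights grow with sample size, the pooled estimate is pulled toward the largest study; a random-effects variant would additionally model between-study variance.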
KEYWORDS
Design Complexity, Software Engineering, Open source software, Systematic literature review
For More Details : http://aircconline.com/ijsea/V8N2/8217ijsea02.pdf
Volume Link : http://www.airccse.org/journal/ijsea/vol8.html
REFERENCES
[1] T. DeMarco, “A metric of estimation quality,” Proceedings of the May 16-19, 1983, national computer conference, Anaheim, California: ACM, 1983, pp. 753-756.
[2] R.B. Grady, Practical Software Metrics for Project Management and Process Improvement, Hewlett-Packard Professional Books, Prentice Hall, New Jersey,1992.
[3] C. Catal and B. Diri, “A systematic review of software fault prediction studies,” Expert Systems with Applications, vol. 36, 2009, pp. 7346-7354.
[4] O. Gomez, H. Oktaba, M. Piattini, and F. García, "A systematic review measurement in software engineering: State-of-the-art in measures", 1st International Conference on Software and Data Technologies (ICSOFT), 2006, pp. 224-231.
[5] E. Arisholm, L. Briand, and E. Johannessen, “A systematic and comprehensive investigation of methods to build and evaluate fault prediction models,” Journal of Systems and Software, vol. 83, 2010, pp. 2-17.
[6] C. Bellini, R. Pereira, and J. Becker, “Measurement in software engineering: From the roadmap to the crossroads,” International Journal of Software Engineering and Knowledge Engineering, vol. 18, 2008, pp. 37-64.
[7] IEEE, IEEE Standard Glossary of Software Engineering Terminology, report IEEE Std 610.12- 1990, IEEE, 1990.
[8] L. Briand, J. Wuest, S. Ikonomovski, and H. Lounis, “A Comprehensive Investigation of Quality Factors in Object-Oriented Designs: An Industrial Case Study” Technical Report ISERN-98-29, International conference on Software Engineering, 1998.
[9] K. El Emam, W. Melo, and J. Machado, “The prediction of faulty classes using object-oriented design metrics,” Journal of Systems and Software, vol. 56, 2001, pp. 63-75.
[10] B. A. Kitchenham, “Guidelines for performing Systematic Literature Reviews in Software Engineering”, Ver 2.3, Keele University, EBSE Technical Report, 2007
[11] L.M. Pickard, B.A. Kitchenham, and P.W. Jones, “Combining empirical results in software engineering,” Journal on Information and Software Technology, vol. 40, Dec. 1998, pp 811-821
[12] M. Ciolkowski, “Aggregation of Empirical Evidence,” Empirical Software Engineering Issues. Critical Assessment and Future Directions, Springer Berlin / Heidelberg, 2007, p. 20
[13] LV. Hedges, I. Olkin, Statistical Methods for Meta-analysis. Orlando, FL: Academic Press, 1995.
[14] H. Cooper, L. Hedges, “Research synthesis as a scientific enterprise”, Handbook of research synthesis (pp. 3-14). New York: Russell Sage, 1994.
[15] J. E. Hannay, T. Dybå, E. Arisholm, D. I. K. Sjøberg, “The Effectiveness of Pair- Programming: A Meta-Analysis”, Journal on Information and Software Technology 55(7):1110-1122, 2009.
[16] M. Ciolkowski, "What do we know about perspective-based reading? An approach for quantitative aggregation in software engineering", 3rd IEEE International Symposium on Empirical Software Engineering and Measurement, 2009, pp. 133-144.
[17] ISO, “International standard ISO/IEC 9126. Information technology: Software product evaluation: Quality characteristics and guidelines for their use.” 1991
[18] S.R. Chidamber and C.F. Kemerer, “Towards a metrics suite for object oriented design,” SIGPLAN Not., vol. 26, 1991, pp. 197-211.
[19] J. M. Scotto, W. Pedrycz, B. Russo, M. Stefanovic, and G. Succi, “Identification of defect-prone classes in telecommunication software systems using design metrics,” Information Sciences, vol. 176, 2006, pp. 3711-3734.
[20] R. Subramanyam and M. Krishnan, “Empirical analysis of CK metrics for object- oriented design complexity: Implications for software defects,” IEEE Transactions on Software Engineering, vol. 29, 2003, pp. 297-310.
[21] Y. Zhou and H. Leung, “Empirical analysis of object-oriented design metrics for predicting high and low severity faults,” IEEE Transactions on Software Engineering, vol. 32, 2006, pp. 771-789.
[22] L. Briand, W. Melo, and J. Wurst, “Assessing the applicability of fault-proneness models across object-oriented software projects,” IEEE Transactions on Software Engineering, vol. 28, 2002, pp. 706-720.
[23] G. Succi, W. Pedrycz, M. Stefanovic, and J. Miller, “Practical assessment of the models for identification of defect-prone classes in object-oriented commercial systems using design metrics,” Journal of Systems and Software, vol. 65, 2003, pp. 1-12.
[24] T. Saracevic, “Evaluation of evaluation in information retrieval”, 18th annual international ACM SIGIR conference on Research and development in information retrieval, Seattle, Washington, United States, 1995, pp. 138-146.
[25] Paulson, J.W.; Succi, G.; Eberlein, A., "An empirical study of open-source and closed-source software products", IEEE Transactions on Software Engineering, vol. 30, no. 4, pp. 246-256, April 2004.
[26] A. Bachmann and A. Bernstein, "Software process data quality and characteristics: a historical view on open and closed source projects," Joint international and annual ERCIM workshops on Principles of software evolution (IWPSE) and software evolution (Evol) workshops, Amsterdam, The Netherlands: ACM, 2009, pp. 119-128.
[27] J. Cohen, Statistical Power Analysis for the Behavioral Sciences (2nd Edition), 2nd ed. Routledge Academic, January 1988.
[28] J. Devore, Probability and Statistics for Engineering and the Sciences, 7th Ed. Thomson Brooks, 2008.
[29] N. D. Anh, “The impact of software design complexity on cost and quality”, Master thesis, [Available ONLINE] http://www.bth.se/fou/cuppsats.nsf/$$Search
International Journal of Software Engineering & Applications (IJSEA)
ISSN : 0975 - 9018 ( Online ); 0976-2221 ( Print )
http://www.airccse.org/journal/ijsea/ijsea.html