Mobile communication technologies have experienced very rapid change in recent years. Active Queue Management (AQM) algorithms are used to address the problems of congestion and packet loss in mobile networks, and queue management algorithms are among the most important factors that directly affect network performance. In this study, AQM algorithms used in mobile networks, such as RED, ARED, SRED, REM, SBF, BLUE, PURPLE, GREEN and CoDel, together with the improved versions of these algorithms developed with different methods and techniques, are presented comparatively.
International Conference on Cyber Security and Computer Science
ICONCS
Muhammet Çakmak
Zafer Albayrak
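As background for the survey above, the sketch below illustrates the core of the classical RED algorithm that the surveyed variants build on: an exponentially weighted moving average of the queue length and a drop probability that grows linearly between a minimum and a maximum threshold. The parameter values are illustrative assumptions, not taken from the paper.

import random

# Illustrative RED parameters (assumed values, not from the paper).
MIN_TH, MAX_TH = 5.0, 15.0   # queue-length thresholds (packets)
MAX_P = 0.1                  # maximum drop probability at MAX_TH
W_Q = 0.002                  # weight of the moving average

avg_queue = 0.0

def red_should_drop(current_queue_len: int) -> bool:
    """Update the average queue size and decide whether to drop the packet."""
    global avg_queue
    avg_queue = (1 - W_Q) * avg_queue + W_Q * current_queue_len
    if avg_queue < MIN_TH:
        return False                      # no congestion: always enqueue
    if avg_queue >= MAX_TH:
        return True                       # severe congestion: always drop
    # Between the thresholds the drop probability grows linearly.
    p = MAX_P * (avg_queue - MIN_TH) / (MAX_TH - MIN_TH)
    return random.random() < p

# Example: feed the controller a slowly growing queue.
for qlen in range(0, 30):
    print(qlen, red_should_drop(qlen))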
In this study, an application that sends the user's location via short message (SMS) has been developed for smartphones running the Android operating system, for use in emergencies when the internet connection is disabled. If the Global Positioning System (GPS) of the user's phone is disabled, the application warns the user with vibration and an on-screen message. In an emergency, the application informs the emergency call center, or sends an SMS to a predetermined number, with the coordinates of the caller obtained through the A-GPS (Assisted GPS) feature. The developed application has been tested both indoors and outdoors, as well as on different brands and models of Android devices. The positioning error is approximately 10-15 meters outdoors and approximately 15-30 meters indoors, and the SMS transmission times are 14-32 seconds and 20-92 seconds, respectively. SMS transmission time differs from region to region depending on the connection time to the base station and magnetic pollution.
International Conference on Cyber Security and Computer Science
ICONCS
Abdurrahman HAZER
İbrahim ÖZER
Remzi YILDIRIM
Nowadays, large-scale and complex computing systems such as the Internet, banking systems, online payment systems, and security and surveillance systems generate a large amount of data every day, and a considerable share of these data is imbalanced. Imbalanced data mislead machine learning models and data mining techniques, and learning from such data is a growing challenge that has attracted increasing attention worldwide, since unevenly distributed classes distort the learning problem. This paper concentrates on a few realistic and appropriate data preprocessing techniques and presents an appropriate class evaluation process for imbalanced data. An empirical comparison of several well-recognized soft computing methods, namely Support Vector Machine (SVM), Decision Tree Classifier (DTC), K-Nearest Neighbor (KNN) and Gaussian Naïve Bayes (GNB), is used to measure Accuracy, Precision, Recall and F-Measure on an imbalanced dataset. The imbalanced data were trained after applying the well-known over-sampling technique Synthetic Minority Over-sampling Technique (SMOTE), under-sampling using the Cluster Centroids (CC) technique, and a hybrid technique named SMOTEENN, which combines SMOTE and the Edited Nearest Neighbor (ENN) rule. Accuracy, Precision, Recall, F-Measure and the confusion matrix are used to evaluate performance. The results show that the hybrid method performs better than the over-sampling and under-sampling techniques alone.
International Conference on Cyber Security and Computer Science
ICONCS
Md. Anwar Hossen
Fatema Siddika
Tonmoy Kumar Chanda
T. Bhuiyan
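A minimal sketch of the resampling-plus-classification pipeline described in the abstract above, using the imbalanced-learn and scikit-learn libraries; the synthetic dataset and parameter choices are assumptions for illustration, not the data used in the paper, and any of the four classifiers named above could replace the GaussianNB shown here.

from collections import Counter
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import classification_report
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import ClusterCentroids
from imblearn.combine import SMOTEENN

# Synthetic imbalanced data (90% majority / 10% minority) as a stand-in.
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.9, 0.1], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

samplers = {
    "SMOTE (over-sampling)": SMOTE(random_state=42),
    "Cluster Centroids (under-sampling)": ClusterCentroids(random_state=42),
    "SMOTEENN (hybrid)": SMOTEENN(random_state=42),
}

for name, sampler in samplers.items():
    X_res, y_res = sampler.fit_resample(X_tr, y_tr)   # resample training data only
    clf = GaussianNB().fit(X_res, y_res)              # any of SVM/DTC/KNN/GNB fits here
    print(name, Counter(y_res))
    print(classification_report(y_te, clf.predict(X_te), digits=3))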
Technology is growing rapidly and cloud computing usage is increasing; most large and small companies use the cloud nowadays. Cloud computing has the economic benefit of pay-as-you-use (i.e., pay on demand). With the increase of cloud usage, security problems on the cloud are also increasing. Mechanisms such as firewalls, vulnerability scanners, Intrusion Detection Systems (IDS) and other methods are used to mitigate intrusions, but they are not enough to detect attacks against the cloud because new intrusions keep appearing. A variety of security methods exist for protecting the cloud from threats and vulnerabilities. In this paper, a new hybrid cloud-based IDRS based on the Grey Wolf Optimizer (GWO) and a Neural Network (NN) is proposed to secure the cloud and detect intrusions. GWO is one of the effective metaheuristic algorithms in many fields, including security. In this paper, GWO is employed to train an NN and the results are compared with other classification algorithms. For the experimental results, the most up-to-date intrusion datasets are used.
International Conference on Cyber Security and Computer Science
ICONCS
İsmai M. Nur
Erkan Ülker
Due to improving technology and the spread of the internet across the entire world, people have adopted its use in an extensive manner. Our critical private data encounter threats coming from outside our computer systems and network environments; in other words, intruders access people's information without authentication or authorization. To overcome such security vulnerabilities, many researchers have turned their attention to a model called the hybrid intrusion detection system, an integration of two or more algorithms in which the output of one algorithm is used as input while the other performs the detection or classification task. This model is very powerful and plays a significant role in cybersecurity. In recent years, machine learning methods have been hybridized with metaheuristic algorithms to obtain an optimum solution. In this study, we present a new model that uses the Sine Cosine Algorithm for feature selection and the Naïve Bayes Classifier (NBC) algorithm for classification. Our main goal is to find a model that delivers good detection performance and preferable accuracy. We compare our experimental results with other algorithms, such as KNN and Decision Tree classifiers, to determine which one performs best in terms of accuracy and detection rate. In addition, the Sine Cosine Algorithm is contrasted with Particle Swarm Optimization (PSO) and the Genetic Algorithm (GA) in terms of feature reduction and the selection of high-quality features. Various datasets, such as NSL-KDD and ISCX 2012, have been applied to the presented method to examine its performance. Finally, the introduced method is evaluated to determine whether it has better performance and superior accuracy compared to the other algorithms.
International Conference on Cyber Security and Computer Science
ICONCS
SALAAD MOHAMED SALAAD
Erkan Ülker
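A sketch of the wrapper-style feature selection the abstract above describes: a binary Sine Cosine Algorithm searching over feature masks with a Naïve Bayes cross-validation score as fitness. The dataset, population size, iteration count and penalty weight are assumptions standing in for the paper's NSL-KDD/ISCX setup, not its actual configuration.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)   # stand-in for NSL-KDD / ISCX 2012
n_feat = X.shape[1]

def fitness(mask):
    """Cross-validated NBC accuracy on the selected feature subset,
    lightly penalized by the number of selected features."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(GaussianNB(), X[:, mask == 1], y, cv=3).mean()
    return acc - 0.01 * mask.sum() / n_feat

POP, T, A = 12, 30, 2.0
pos = rng.random((POP, n_feat))              # continuous positions in [0, 1]
masks = (pos > 0.5).astype(int)
fits = np.array([fitness(m) for m in masks])
best = pos[fits.argmax()].copy()

for t in range(T):
    r1 = A - t * A / T                       # amplitude decreases over iterations
    for i in range(POP):
        r2 = rng.random(n_feat) * 2 * np.pi
        r3 = rng.random(n_feat) * 2
        r4 = rng.random(n_feat)
        step = np.where(r4 < 0.5,
                        r1 * np.sin(r2) * np.abs(r3 * best - pos[i]),
                        r1 * np.cos(r2) * np.abs(r3 * best - pos[i]))
        pos[i] = np.clip(pos[i] + step, 0, 1)
    masks = (pos > 0.5).astype(int)
    fits = np.array([fitness(m) for m in masks])
    if fits.max() > fitness((best > 0.5).astype(int)):
        best = pos[fits.argmax()].copy()

print("selected features:", int((best > 0.5).sum()), "of", n_feat)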
Automatic opacity determination of the voxels in a 3D image is important for obtaining the right interpretation of the image. In this study, we propose a method based on the Particle Swarm Optimization (PSO) algorithm that can be used to find the opacity values of the opacity transfer function. The method requires information about the region of interest (ROI) in the image. The performance of the proposed method is analyzed on phantom images containing nested spheres, and the results are presented visually.
International Conference on Cyber Security and Computer Science
ICONCS
Ç. KILIKÇIER
E. YILMAZ
A new encryption/decryption algorithm has been developed by using a new chaotic circuit, namely the modified Chua's circuit (MCC). The importance of the MCC is that it exhibits hyperchaotic behavior over a large parameter regime due to its double frequency dependent nature. The numbers extracted from the solutions of the MCC are fed into the newly developed algorithm for encryption and decryption purposes. A scrambling feature, implemented at the bit level using the MCC, has been applied in the algorithm. Following the encryption procedure, the encrypted color image has been evaluated by a variety of tests, including secret key size and secret key sensitivity analysis, histogram analysis, correlation analysis, differential analysis, and information entropy analysis. The results are good and provide an efficient technique for color image encryption and decryption in the context of secure communication.
International Conference on Cyber Security and Computer Science
ICONCS
B.ARPACI
E.KURT
K.ÇELİK
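To make the encrypt/scramble/decrypt flow of the abstract above concrete, the sketch below XORs image bytes with a chaotic keystream and applies a chaos-driven permutation. The logistic map is used only as a simple stand-in for the modified Chua's circuit, and the image, seed and parameters are illustrative assumptions.

import numpy as np

def logistic_keystream(n, x0=0.4567, r=3.99):
    """Chaotic keystream from the logistic map, used here as a simple
    stand-in for the modified Chua's circuit of the paper."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1 - x)
        xs[i] = x
    return (xs * 255).astype(np.uint8), xs

def encrypt(img, x0=0.4567):
    flat = img.astype(np.uint8).ravel()
    key_bytes, chaos = logistic_keystream(flat.size, x0)
    perm = np.argsort(chaos)                 # chaotic permutation = scrambling step
    return (flat[perm] ^ key_bytes).reshape(img.shape)

def decrypt(cipher, x0=0.4567):
    key_bytes, chaos = logistic_keystream(cipher.size, x0)
    perm = np.argsort(chaos)
    scrambled = cipher.ravel() ^ key_bytes
    flat = np.empty_like(scrambled)
    flat[perm] = scrambled                   # undo the permutation
    return flat.reshape(cipher.shape)

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # toy colour image
cipher = encrypt(img)
assert np.array_equal(decrypt(cipher), img)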
This paper addresses the challenges of load forecasting
that occur due to the complex nature of load in different
predicting horizons as well as the total consumption within these horizons. It is often not easy to accurately fit the several complex factors that affect the demand for electricity into
the predicting models. More so, due to the dynamic nature of
these complex factors (i.e., temperature, humidity and other
factors that influence consumption), it is difficult to derive an
accurate demand forecast based on these parameters. As a
consequence, a model that uses hourly electricity loads and
temperature data to forecast the next hourly loads is proposed.
The model is based on modified entropy mutual information
based feature selection to remove irrelevancy and redundancy
from the dataset. Conditional restricted Boltzmann machine
(CRBM) is investigated to perform load forecasting; accuracy
and convergence are improved to reduce the CRBM’s forecast
error via a Jaya based meta-heuristic optimization algorithm.
The proposed model is implemented on the publicly available
dataset of GEFCom2012 of the US utility. Comparative analysis
is carried out on an existing accurate, fast-converging short-term load forecasting (AFC-STLF) model, since it has a similar
architecture to the proposed model. Simulation results confirm
that the proposed model improves the accuracy up to 56.32% as
compared to 43.67% of AFC-STLF. Besides, the proposed model
reduces the average execution time up to 53.87% as compared
to 46.12% of AFC-STLF.
International Conference on Cyber Security and Computer Science
ICONCS
Omaji Samuel
Nadeem Javaid
Asma Rafique
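The feature-selection stage described in the abstract above (removing irrelevant and redundant inputs before the CRBM) can be approximated with scikit-learn's mutual information estimator. The synthetic hourly data below are an assumption standing in for the GEFCom2012 series, and the paper's modified-entropy criterion is simplified to a plain relevance threshold.

import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
n = 1000
temp = rng.normal(20, 8, n)                    # hourly temperature
prev_load = rng.normal(500, 50, n)             # load one hour earlier
noise = rng.normal(0, 1, n)                    # an irrelevant feature
load = 0.7 * prev_load + 5.0 * np.maximum(temp - 22, 0) + rng.normal(0, 10, n)

X = np.column_stack([temp, prev_load, noise])
mi = mutual_info_regression(X, load, random_state=0)

# Keep only features whose relevance clears a threshold; redundancy between
# the survivors would additionally be checked pairwise in the paper's method.
threshold = 0.05
keep = [name for name, score in zip(["temp", "prev_load", "noise"], mi)
        if score > threshold]
print(dict(zip(["temp", "prev_load", "noise"], mi.round(3))), "->", keep)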
This article deals with the development of a mathematical model, appropriate computational algorithms and a set of application programs in a high-level object-oriented programming language, which allow us to numerically simulate and study the thermomechanical state of rods in the presence of local thermal insulation, heat exchange, temperature and axial forces, taking into account the pinching of the ends of the rod.
International Conference on Cyber Security and Computer Science
ICONCS
Kanat Amirtaev
The popularity of web applications is growing quickly because they fulfil the requirements of businesses and satisfy the needs of consumers. Web applications are now capable of providing business services to their stakeholders in the most effective and efficient manner. In this modern age, a great number of services are provided through web applications, and their performance is measured through service processing time and informative functionality. However, those services can at the same time be exposed to threats due to improper validation. Currently, cyber-attacks are a critical risk for every digital transformation throughout the world. Careless coding practices during development and a lack of knowledge about security are the root causes of the different types of application-layer vulnerability that remain in web systems. Remote Code Execution (RCE) is one of the most serious vulnerabilities of this era. According to the Web Application Security Project (CWE/SANS), RCE has been listed as the second-ranked critical web application vulnerability since 2016. Little significant research work on RCE was found during the literature review. This paper presents a complete case study on the RCE vulnerability.
International Conference on Cyber Security and Computer Science
ICONCS
S. Biswas
M. M. H. K. Sajal
T. Afrin
T. Bhuiyan
M. M. Hassan
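As a minimal illustration of the class of flaw the case study above targets, the hypothetical snippet below shows how evaluating unvalidated input leads to code execution, together with a safer alternative. The function and input names are invented for the example and are not taken from the paper.

import ast

# Vulnerable pattern: a handler that evaluates a user-supplied expression.
def calc_vulnerable(user_expr: str):
    return eval(user_expr)        # "__import__('os').system('id')" would run a command

# Safer pattern: only literal values (numbers, strings, lists, ...) are accepted.
def calc_safe(user_expr: str):
    try:
        return ast.literal_eval(user_expr)   # raises for code instead of executing it
    except (ValueError, SyntaxError):
        raise ValueError("input rejected: not a plain literal")

print(calc_safe("[1, 2, 3]"))                       # parsed as data: [1, 2, 3]
try:
    calc_safe("__import__('os').system('id')")      # rejected, never executed
except ValueError as e:
    print(e)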
The student model is dedicated to personalizing and adapting learning. With pedagogical strategy self-switching, the monitoring of the student model is the cornerstone of pedagogical strategy adaptation. To achieve the monitoring operation efficiently, we propose a fine-grained, WildCAT-based Observable Bayesian Student Model. On one side, it represents how the user relates to the concepts of the knowledge structure using the pedagogical component. On the other side, it integrates concept-level sensors, resulting in an observable network of sensors. This ensures the collection of the student's instantaneous knowledge level. In addition, it uses a publish/subscribe communication model to notify the monitoring component of changes in the student's cognitive state, with the monitoring component subscribing as a receiver of the appropriate cognitive changes. To test the feasibility and usefulness of this model, a framework is constructed using WildCAT on the student cognitive level.
International Conference on Cyber Security and Computer Science
ICONCS
S. BOULEHOUACHE
Selma Ouareth
Ramdane Maamri
This paper contains an analysis of the main modern approaches to dynamic code generation, in particular the generation of new classes of objects during program execution. The main attention is paid to universal exploiters of homogeneous classes of objects, which were proposed as a part of the knowledge-representation model known as object-oriented dynamic networks, as tools for the generation of new classes of objects at program runtime. As a result, algorithms for the implementation of such universal exploiters of classes of objects as union, intersection, difference and symmetric difference were developed. These algorithms can be used in knowledge-based intelligent systems that are based on object-oriented dynamic networks, and they can be adapted for some object-oriented programming languages with powerful metaprogramming opportunities.
International Conference on Cyber Security and Computer Science
ICONCS
D. O. Terletskyi
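A small sketch of the kind of runtime class generation the abstract above refers to, using Python's built-in type() metaprogramming: the union and intersection exploiters below merge the members of two existing classes into newly generated classes. The class names are hypothetical examples, and the semantics of the paper's exploiters over object-oriented dynamic networks are richer than this.

def union_exploiter(name, cls_a, cls_b):
    """Generate, at runtime, a new class whose namespace is the union of the
    namespaces of cls_a and cls_b (cls_a wins on name clashes)."""
    namespace = {}
    for cls in (cls_b, cls_a):                     # apply cls_a last so it takes priority
        namespace.update({k: v for k, v in vars(cls).items()
                          if not k.startswith("__")})
    return type(name, (object,), namespace)

def intersection_exploiter(name, cls_a, cls_b):
    """Generate a class containing only the members both classes define."""
    common = {k: v for k, v in vars(cls_a).items()
              if not k.startswith("__") and k in vars(cls_b)}
    return type(name, (object,), common)

class Circle:
    sides = 0
    def area_hint(self): return "pi * r**2"

class Square:
    sides = 4
    def area_hint(self): return "a**2"
    def perimeter_hint(self): return "4 * a"

Shape = union_exploiter("Shape", Circle, Square)
print(sorted(k for k in vars(Shape) if not k.startswith("__")))
# ['area_hint', 'perimeter_hint', 'sides']
print(Shape().area_hint())   # Circle's definition wins: 'pi * r**2'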
DRDoS is the new method of choice for denial of
service attacks: Certain services running over UDP are chosen for
the attack. Servers across the Internet are contacted by bots with the
spoofed IP address of the victim host. In response, huge amounts of
response data created by the servers are sent to the victim,
temporarily disabling it. The most commonly exploited protocols are
those that yield the highest "amplification factor", including NTP,
DNS, and Memcached. Mitigation of these attacks can be done simply by hardening
servers against known vulnerabilities. However, in practice, there
are many servers that lag behind. In this study, we carried out a
regional analysis of NTP, DNS, and Memcached servers in Europe,
and assessed their readiness against being used as amplifiers in
DRDoS attacks.
International Conference on Cyber Security and Computer Science
ICONCS
Emre Murat ERCAN
Ali Aydın SELÇUK
The Internet of Things (IoT), a technology in which various physical devices are interconnected using a conglomeration of technologies, is one of the fastest growing sectors. This ever-increasing demand for IoT devices is satisfied by products from many different companies with varying quality and, more importantly, varying principles regarding security. The fact that unified security protocols and approaches are lacking between manufacturers, and that no significant regulations or legislation concerning IoT exist at the national or international level, creates a significant security risk. Moreover, well-known security solutions are often incompatible with IoT devices, mainly because of their power and computational constraints. This work aims to identify the current security risks concerning IoT and to present some of the solutions that address these risks. The physical, regulatory and social challenges stemming from IoT security solutions are analyzed, and future directions are explored.
International Conference on Cyber Security and Computer Science
ICONCS
DORUK PANCAROGLU
SEVIL SEN
Home energy management systems (HEMSs) based on demand response (DR) synergized with the optimal dispatch of renewable energy sources (RESs) and energy storage systems (ESSs) (DRSREOD) are used to implement demand-side management in homes. Such HEMSs benefit the consumer and the utility by reducing energy bills, reducing peak demand, achieving overall energy savings and enabling the sale of surplus energy. Further, a drastically rising demand for electricity has forced a number of utilities in developing countries to impose large-scale load shedding (LSD). A HEMS based on DRSREOD integrated with an LSD-compensating dispatchable generator (LDG) (DRSREODLDG) ensures an uninterrupted supply of power for consumers subjected to LSD. However, the LDG operation that compensates for the interrupted supply of power during the LSD hours is accompanied by greenhouse gas (GHG) emissions, which need to be minimized to conserve the environment. A three-step, simulation-based a posteriori method is proposed to develop a scheme for the eco-efficient operation of a DRSREODLDG-based HEMS. The method provides the tradeoffs between the net cost of energy (CEnet) to be paid by the consumer, the time-based discomfort (TBD) due to shifting home appliances (HAs) to participate in the HEMS operation, and the minimal emissions (TEMiss) from the local LDG. The search is driven by a multi-objective genetic algorithm and Pareto-based optimization. A surface fit is developed using polynomial regression models based on the least sum of squared errors, and selected solutions are classified for critical tradeoff analysis to enable the consumer to choose the best option by consulting a diverse set of eco-efficient tradeoffs between CEnet, TBD and TEMiss.
International Conference on Cyber Security and Computer Science
ICONCS
Bilal Hussain
Nadeem Javaid
Qadeer-ul Hasan
Yüksel Çelik
Asma Rafique
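The Pareto-based selection step described in the abstract above can be made concrete with a small dominance filter over candidate (CEnet, TBD, TEMiss) triples, all three to be minimized. The numbers below are invented placeholders, not results from the paper, and a full genetic algorithm would generate the candidates rather than listing them by hand.

import numpy as np

def pareto_front(points):
    """Return the indices of non-dominated points (all objectives minimized)."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical candidate schedules: (net energy cost, time-based discomfort, emissions)
candidates = [
    (10.2, 0.8, 3.1),
    ( 9.5, 1.9, 2.7),
    (11.0, 0.5, 3.0),
    (10.3, 0.9, 3.2),   # dominated by the first candidate
    ( 9.9, 1.0, 2.5),
]
front = pareto_front(candidates)
print("eco-efficient tradeoffs:", [candidates[i] for i in front])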
Technologies are constantly being developed and commercialized in the current era of the digital world. Wearable devices are among the most rapidly growing classes of information technology devices in developing countries. Drawing upon the Unified Theory of Acceptance and Use of Technology 2 (UTAUT2), this paper examines the use behavior of wearable devices. Data were collected from 150 smart watch users in Bangladesh using a survey questionnaire. The results indicate that performance expectancy, hedonic motivation and habit play a positive, influential role in the adoption of wearable devices. Our study shows that these three independent variables affect the behavioral intention to use wearable devices, and that the behavioral intention of using wearable devices among the people of Bangladesh is in particular influenced by habit. Our proposed model is empirically tested, contributes to an emerging body of work on technology acceptance, and can motivate users of wearable devices. This research sheds light on the industry by identifying factors that could affect consumers of wearable devices and could serve as a diagnostic tool for the industry to penetrate the wearable device market.
International Conference on Cyber Security and Computer Science
ICONCS
Sharmin Akter
Dr. Imran Mahmud
Md. Fahad Bin Zamal
Jannatul Ferdush
Machine learning algorithms have configurable parameters. Known as hyperparameters, they are generally used with their default settings. However, in order to increase the success of a machine learning algorithm, sophisticated techniques are required to tune the hyperparameters. Tuning a machine learning algorithm takes great effort, and existing methods can only be performed via discrete programming tools. In this paper, a user-friendly hyperparameter tuning tool is proposed for ensemble learning. It encompasses selecting the tuning algorithm and the data set, and visualizing performance. In addition, the developed tool is compatible with executing R code to conduct big data experiments.
International Conference on Cyber Security and Computer Science
ICONCS
Muhammed Maruf ÖZTÜRK
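A minimal sketch of the kind of ensemble hyperparameter tuning the tool described above wraps, here with scikit-learn's grid search over a random forest. The grid and dataset are illustrative assumptions, and the actual tool drives R code as noted in the abstract.

from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_digits(return_X_y=True)

# Hyperparameters usually left at their defaults, tuned explicitly here.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 10, 20],
    "max_features": ["sqrt", "log2"],
}

search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=3, n_jobs=-1, scoring="accuracy")
search.fit(X, y)

print("best parameters:", search.best_params_)
print("cross-validated accuracy: %.3f" % search.best_score_)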
The Semantic Web is a web environment that allows
well defined information and services to be easily understood by
machines. The main component of the Semantic Web is
ontologies, which formally define a set of concepts for a domain
and the relationships between concepts. One of the areas where
ontologies can be used is the field of healthcare. In particular, the
use of ontologies in the field of healthcare is recommended
because of the formal representation of a subject area and its
support for reusability. Many medical classification systems are
used in the field of medical informatics. The deficiency seen in the
proposed approaches for health information systems is that there
is no meaningful reference and sharing system that enables the
collection of classification and coding systems. Considering the general classification and coding systems, it is clear that a system is required for faster, more accurate and more efficient processing. In accordance with the interoperability needs of health information systems that are built from different ontology combinations, this study proposes an ontology based on the SNOMED CT (Systematized Nomenclature of Medicine - Clinical Terms) concept model. This model provides a reasonable definition of the concepts in SNOMED CT. The formal semantics of the ontology enhance the ability to automate the information management of complex terminology, facilitate the maintenance of clinical decision support materials, and significantly improve interoperability.
International Conference on Cyber Security and Computer Science
ICONCS
Yasemin Gültepe
Distributed Denial of Service (DDoS) attacks are a serious threat to any online service on the internet. In contrast to other traditional threats, a DDoS HTTP GET flood attack can exploit the legitimate HTTP request mechanism to effectively deny any online service by flooding the victim with an overwhelming amount of useless network traffic. This paper introduces a new anomaly-based technique for discriminating DDoS HTTP GET requests from legitimate requests using a combination of behavioral features. The key features are the diversity of the requested objects, the request rates for all requested objects, and the request rate for the most frequently requested object. These features are selected as the key measurements that are analyzed and processed to develop the proposed detection technique. During the evaluation process, a subset of the UNB ISCX IDS 2012 evaluation dataset representing anomalous traffic, together with another subset extracted from the 1998 World Cup dataset representing legitimate traffic, is used to evaluate the proposed method. The evaluation shows that the proposed mechanism provides effective detection owing to the subtle behavioral dissimilarity between non-recursive attack traffic and legitimate request traffic.
International Conference on Cyber Security and Computer Science
ICONCS
Mohammed SALIM
Seçkin ARI
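A sketch of how the three behavioral features named in the abstract above (object diversity, aggregate request rate, and the rate of the most-requested object) might be computed per source over a time window. The log format, window length and sample values are assumptions for illustration.

from collections import Counter, defaultdict

WINDOW = 60.0  # seconds

# Hypothetical (timestamp, source_ip, requested_object) tuples from an access log.
requests = [
    (0.5, "10.0.0.7", "/index.html"),
    (1.0, "10.0.0.7", "/index.html"),
    (1.2, "10.0.0.7", "/index.html"),
    (2.0, "10.0.0.9", "/index.html"),
    (3.5, "10.0.0.9", "/style.css"),
    (4.0, "10.0.0.9", "/logo.png"),
]

per_source = defaultdict(list)
for ts, src, obj in requests:
    per_source[src].append((ts, obj))

for src, events in per_source.items():
    objects = Counter(obj for _, obj in events)
    diversity = len(objects) / len(events)            # unique objects per request
    total_rate = len(events) / WINDOW                 # requests per second overall
    top_rate = objects.most_common(1)[0][1] / WINDOW  # rate of the hottest object
    print(src, {"diversity": round(diversity, 3),
                "total_rate": round(total_rate, 3),
                "top_object_rate": round(top_rate, 3)})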
Recently big data analytics are gaining popularity in
the energy management systems (EMS). The EMS are responsible
for controlling, optimizing and managing energy market operations. Energy consumption forecasting plays a key role in EMS and helps in generation planning, management and energy conservation. A large amount of data is collected by smart meters on a daily basis. Big data analytics can
help in achieving insights for smart energy management. Several
prediction methods are proposed for energy consumption
forecasting. This study explores the state-of-the-art forecasting
methods. The studied forecasting methods are classified into
two major categories: (i) univariate (time series) forecasting
models and (ii) multivariate forecasting models. The strengths
and limitations of the studied methods are discussed. A comparative analysis of these methods is also presented in this survey. Furthermore,
the forecasting techniques are reviewed from the aspects of big
data and conventional data. Based on this survey, the gaps in
the existing research are identified and future directions are
described.
International Conference on Cyber Security and Computer Science
ICONCS
Sana Mujeeb
Nadeem Javaid
Sakeena Javaid
Asma Rafique
Manzoor Ilahi
In recent years, the growing, large amounts of data associated with the widespread use of social media, smart devices and the internet have come to define big data. With big data, a vast number of things that were formerly never measured, stored, analyzed or shared have been converted into processed and usable data. The term big data typically describes both the type of data being managed and the technology used to collect and operate on it. Data can be transformed into information that has value, but without interpretation this information cannot be made truly useful to people. Nowadays big data attracts attention for qualities such as its volume, velocity and variety. With the increased use of big data, a major breakthrough in productivity, profitability and innovation in different sectors is expected. There are many successful applications of big data in different areas of the world, for example in the public sector, health, insurance, banking and education. Big data can help improve productivity, profitability and performance and reduce data waste. In education, health, banking, retail sales, government services, the defense industry, and the production and energy sectors, as well as in facilitating human life, it will increase the efficiency of institutions and constitute the infrastructure of further progress towards the future. In this study, big data is handled conceptually: its relations with many concepts, big data technologies and the methods used for big data processing are introduced, and different examples are given of the usage areas of big data in the world.
International Conference on Cyber Security and Computer Science
ICONCS
Yasemin Gültepe
Today, almost everything is done through networks. In particular, networks are widely used for the transport of data. Various methods are used to move data from one place to another, and one of these methods is Optical Burst Switching (OBS). When carrying data over OBS, some threats may be encountered as a result of security shortcomings. Some of these
threats are Spoofing, Replay Attack, Circulating Burst Header
Attack and Burst Header Packet (BHP) Flooding Attack.
Detection of threats is difficult but it is very important to our
safety. Therefore, using Machine Learning (ML) methods to
detect threats will give us flexibility, time and accuracy. In this
study, we will classify BHP Flooding Attack data that have four
class labels with ML methods. Our class labels are as follows:
Misbehaving-Block (Block), Behaving-No Block (No Block),
Misbehaving-No Block (NB-No Block), and Misbehaving-Wait
(NB-Wait). Methods used in classification are Decision Tree
(J48), Logistic, Multilayer Perceptron (MLP), Random Tree
(RT), Reduce Error Pruning (REP) Tree and Naive Bayes (NB).
Since there are 22 properties in the data set, the results of feature
selection are also examined using the same classification methods.
As a result, J48 and RT have been found to achieve the best
results with 100% accuracy.
International Conference on Cyber Security and Computer Science
ICONCS
V.N. UZEL
E. SARAÇ EŞSİZ
Many studies have been carried out to draw attention to violence against women around the world. The aim of these studies is to raise awareness in society in general and to encourage women. For this purpose, this paper aims to draw attention to violence against women by using data mining classification algorithms. The purpose of this study is to analyze data from Twitter, which is one of the most widely used social media platforms, and to determine how strongly words such as violence, women and harassment are related to violence against women as a cybercrime. To this end, tweets containing certain words were collected from Twitter. Tweets were obtained according to some attributes using the Python language and the streaming API; the Tweepy library was also used with this streaming API. The tweets were then analyzed in the WEKA tool using various data mining classification algorithms. According to the experimental results, the best classifier was the J48 algorithm with 82.9% accuracy and a 0.902 F-Measure value.
International Conference on Cyber Security and Computer Science
ICONCS
M. KAYA KELEŞ
A. Ş. EROL
Twitter is one of the most widely used social networks
today. Because of its wide usage, it is also the target of various
spam attacks. In recent years, spam detection on Twitter using artificial intelligence methods has become quite popular. Twitter spam detection approaches are generally categorized into the following types: user-based, content-based and social-network-based spam detection. In this paper, a spam detection approach based on user-based features is proposed. Using a publicly available recent baseline dataset, 11 lightweight user-based features are selected for model creation. These features were selected for ease of computation and rapid processing, since they are numeric or boolean. The advantage of the user-based spam detection approach is that results are obtained more rapidly, since it does not rely on complex features. The selected features are: verified, default profile, default profile image, favorites count, followers count, friends count, statuses count, geo enabled, listed count, profile background tile, and profile use background image. The verified feature is used as the class label to measure the success of the model. After the
feature selection, the dataset is divided into test and training data.
Following 10 common supervised machine learning algorithms
are selected for the experiments: (1) Support Vector
Classification, (2) K Nearest Neighbor, (3) Naive Bayes, (4)
Decision Tree, (5) Bagging, (6) Random Forest, (7) Extra Trees,
(8) AdaBoost, (9) Multi Layer Perceptron, and (10) Logistic
Regression. The success of the algorithms is measured using the following 9 metrics: (1) Accuracy, (2) Precision, (3) Recall, (4)
True Positive, (5) True Negative, (6) False Positive, (7) False
Negative, (8) Training Time, (9) Testing Time. The results were
compared according to the metrics above.
International Conference on Cyber Security and Computer Science
ICONCS
Anıl Düzgün
Fecir Duran
Atilla Özgür
Mobile payment services are among the newest and most popular technologies, developing according to our habits and needs. Consumers all over the world use mobile phones for payment as well as communication. The main purpose of using a mobile payment application is to complete all transactions easily and quickly. Not only data security in electronic transactions but also the speed of system operations is becoming very important. There is a threshold value for completing a transaction in mobile payment systems; if the security algorithm is too complex and exceeds this threshold, it is not suitable for use in mobile payment systems. In this paper we compare cryptographic algorithms and propose two algorithms based on the Advanced Encryption Standard. The experimental results show that the proposed algorithms are suitable cryptographic algorithms for mobile systems in terms of time and storage consumption.
International Conference on Cyber Security and Computer Science
ICONCS
Ö. ŞENGEL
M. A. AYDIN
A. SERTBAŞ
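A small sketch of the kind of time measurement the comparison above relies on, using the PyCryptodome library's AES implementation. The payload size, key and modes are placeholders, and the paper's two proposed AES-based variants are not reproduced here.

import os, time
from Crypto.Cipher import AES   # PyCryptodome

key = os.urandom(16)                       # 128-bit session key
payload = os.urandom(64 * 1024)            # a 64 KB transaction payload

def time_mode(make_cipher):
    start = time.perf_counter()
    for _ in range(100):                   # repeat to get a stable average
        cipher = make_cipher()
        cipher.encrypt(payload)
    return (time.perf_counter() - start) / 100

print("AES-CTR: %.4f s" % time_mode(lambda: AES.new(key, AES.MODE_CTR, nonce=os.urandom(8))))
print("AES-CBC: %.4f s" % time_mode(lambda: AES.new(key, AES.MODE_CBC, iv=os.urandom(16))))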
In this paper, we define a new discrete logarithm problem over composite moduli, in contrast to the classical discrete logarithm problem, which uses only prime moduli, and we construct a new ElGamal cryptosystem based on this new discrete logarithm problem. We call the new system the Composite ElGamal cryptosystem. We then present an application of the Composite ElGamal cryptosystem to asymmetric cryptography and finally compare the Composite ElGamal cryptosystem with the ElGamal cryptosystem in cryptographic terms, concluding that the Composite ElGamal cryptosystem is more advantageous than the ElGamal cryptosystem.
International Conference on Cyber Security and Computer Science
ICONCS
C. ÖZYILMAZ
Ayşe Nallı
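For reference, a toy implementation of the classical prime-modulus ElGamal scheme that the proposed composite-modulus variant generalizes. The tiny parameters are for illustration only and are far too small for real security, and the composite construction itself is not reproduced here.

import random

# Toy public parameters: a small prime p and a base g (illustrative only).
p, g = 2357, 2

# Key generation: private x, public h = g^x mod p.
x = random.randrange(2, p - 1)
h = pow(g, x, p)

def encrypt(m):
    """ElGamal encryption of 0 <= m < p."""
    k = random.randrange(2, p - 1)          # ephemeral randomness
    return pow(g, k, p), (m * pow(h, k, p)) % p

def decrypt(c1, c2):
    s = pow(c1, x, p)                        # shared secret g^(xk)
    return (c2 * pow(s, p - 2, p)) % p       # multiply by s^-1 (Fermat inverse)

m = 1001
c1, c2 = encrypt(m)
assert decrypt(c1, c2) == m
print("ciphertext:", (c1, c2), "-> decrypted:", decrypt(c1, c2))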
In this work, an encryption scheme has been developed to ensure that confidential information is communicated securely. As a method, a randomly generated phase mask and a grey-level picture made entirely of noise are used. The information, whose phase is corrupted, is placed in this noisy image according to a predetermined algorithm. First of all, the image is masked with a randomly generated phase mask, and then the pixel values of the image, whose phase values are completely corrupted, are scattered into the carrier by shifting them with certain mathematical operations. In order to recover the encrypted image and information, the carrier and the randomly generated phase keys are used in turn. The reliability of the algorithm developed with two keys, and its robustness to noise attacks, have been tested. In addition, the reliability of the developed algorithm is tested with techniques such as correlation, histogram analysis and contrast stretching.
International Conference on Cyber Security and Computer Science
ICONCS
Abdurrahman HAZER
İbrahim ÖZER
Remzi YILDIRIM
Nowadays, and rightly so, the concept of cyber security is very important. The most effective weapon in this area is undoubtedly malicious software. Therefore, it is all the more important to analyze malware effectively and to prevent possible harm. One of the techniques for analyzing malware is sandboxing. There are many sandbox options in the wild that can be preferred depending on the situation and the service provided. In this paper, the differences between free open-source and commercial sandboxes are discussed. Several advantages and disadvantages of each are identified and presented in the results.
International Conference on Cyber Security and Computer Science
ICONCS
G.Kale
Erkan Bostanci
F.V. ÇELEBİ
Smart grid technologies ensure the reliability, availability and efficiency of energy, which contributes to economic and environmental benefits. On the other hand, communities have smart homes with private energy backups; however, the unification of these backups can be beneficial for the community. A community consists of a certain number of smart homes (SHs), each of which has its own battery-based energy storage system. In this paper, 12 smart communities are connected with 12 fog computing environments for power economy sharing within the community. Each community has 10 smart homes with battery-based energy storage systems. These communities are evaluated for load and cost profiles under three scenarios: SHs without a storage system, SHs with a storage system for individual SH requirements, and SHs with a unified energy storage system (unified-ESS). The unified-ESS is formed with the help of home-based and fog-based agents. Simulations show that the unified-ESS efficiently reduces the cost for SHs within the community.
International Conference on Cyber Security and Computer Science
ICONCS
Rasool Bukhsh
Nadeem Javaid
Asma Rafique
The basis of biometric authentication is that each
person's physical and behavioral characteristics can be accurately
defined. Many authentication techniques have been developed over the years, and human gait recognition is one of these techniques. This article studies the HugaDB database, which is a human gait data collection for analysis and activity recognition (Chereshnev and Kertesz-Farkas, 2017). Combined activity data of different people were collected in the HugaDB database; the activities are walking, running, sitting and standing, and the data were collected with devices such as wearable accelerometers and gyroscopes (Chereshnev and Kertesz-Farkas, 2017). Only the walking dataset of HugaDB was previously used in an artificial neural network-based method for real-time gait analysis with a minimal number of Inertial Measurement Units (Sun et al., 2018). In this paper, each person is considered as a different
class because there are multiple users' gait data in the database
and some machine learning algorithms have been applied to
walking, running, standing and sitting data. The best algorithms
are chosen from the algorithms applied to the HugaDB data and
the results are shared.
International Conference on Cyber Security and Computer Science
ICONCS
Aybüke KEÇECİ
Armağan YILDIRAK
Kaan ÖZYAZICI
Gülşen AYLUÇTARHAN
Onur AĞBULUT
İbrahim ZİNCİR
Forensic computing is a new branch of science created to facilitate the decision of the investigator in a case by examining the evidence obtained from information systems. In forensic computing, every device in hand has a data summary value (hash). The hash is a unique numerical value recorded by the investigator so that, by preserving the integrity of the evidence, it can be accepted as evidence in court. In accordance with Article 134 of the Criminal Procedure Law (CMK), this numerical value, which is unique to the evidence, should not be altered in any way. If it changes, the evidence in question is no longer evidence, and it will not be taken into consideration by the investigating authority even if it gives a clue about the suspect.
In this study, we describe hash applications in forensics and the methods for calculating this numerical value, which is of great importance during judgement.
International Conference on Cyber Security and Computer Science
ICONCS
Muhammet Tahir GÜNEŞER
Hacı Hasan OKUYUCU
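A minimal sketch of the hash computation discussed above, producing MD5, SHA-1 and SHA-256 digests of an evidence image in one pass with Python's standard hashlib module; the file path is a placeholder.

import hashlib

def hash_evidence(path, chunk_size=1 << 20):
    """Compute common forensic digests of a file without loading it whole."""
    digests = {name: hashlib.new(name) for name in ("md5", "sha1", "sha256")}
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            for d in digests.values():
                d.update(chunk)
    return {name: d.hexdigest() for name, d in digests.items()}

# Example (placeholder path): any later change to the image changes every digest,
# which is what ties the evidence presented in court to the original acquisition.
print(hash_evidence("evidence_disk.img"))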
Authenticated encryption is a special form of
cryptographic system providing two main services at the same
time with a single key: confidentiality and authentication. In 2013, ICRC invited authenticated encryption candidates to the CAESAR competition to identify a widely adoptable authenticated encryption algorithm with advantages over AES-GCM. In this study, to analyze the competing algorithms, we construct an extensive metric set by reviewing previous studies and candidate cipher reports; the set is composed of all structural metrics mentioned in previous studies. We then develop a grading policy for each metric and evaluate the ciphers' performance and security. Improvable parts of the cipher structures are deduced and listed. Finally, possible future work suggestions are given to extend the metric list and to design better cipher structures.
International Conference on Cyber Security and Computer Science
ICONCS
S.E. ULUSOY
Orhun Kara
M.Önder EFE
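For context, the sketch below shows the baseline the CAESAR candidates are measured against: AES-GCM providing confidentiality and authentication with a single key, via the PyCryptodome library. The key, nonce handling and message are illustrative placeholders.

import os
from Crypto.Cipher import AES   # PyCryptodome

key = os.urandom(16)
nonce = os.urandom(12)                     # unique per message under the same key
plaintext = b"wire transfer #4711"
header = b"session-42"                     # authenticated but not encrypted

cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
cipher.update(header)
ciphertext, tag = cipher.encrypt_and_digest(plaintext)

# The receiver verifies the tag while decrypting; any tampering raises ValueError.
receiver = AES.new(key, AES.MODE_GCM, nonce=nonce)
receiver.update(header)
recovered = receiver.decrypt_and_verify(ciphertext, tag)
assert recovered == plaintext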
GOST (Gosudarstvennyi Standart) algorithms are state security algorithms developed by the Russian Federation (formerly the Soviet Union). The first of these algorithms is the GOST 28147-89 encryption and decryption algorithm, developed in 1987. The other algorithms are the GOST 34.11-94 hash function algorithm and the GOST 34.10-2001 digital signature algorithm. The GOST 28147-89 encryption algorithm is a 64-bit block cipher, and this main algorithm is used within the GOST 34.11-94 hash function algorithm and the GOST 34.10-2001 digital signature algorithm. Since the computational cost of encryption algorithms is very high, FPGAs (Field Programmable Gate Arrays) are, apart from ASICs (Application-Specific Integrated Circuits), the best platforms on which to implement real-time, fast encryption algorithms. In this study, the GOST 28147-89 encryption and decryption algorithm is implemented in Verilog and its speed is tested for real-time applications.
International Conference on Cyber Security and Computer Science
ICONCS
H. AKTAŞ
Nowadays, risks related to information security are increasing with each passing day. Both public institutions and the private sector are working to provide information security. It is inevitable that institutions must use the methodology and tools most appropriate to their own needs and legal responsibilities in order to provide information security. In particular, the Personal Data Protection Law, other legal regulations and the growth of cybersecurity risks oblige public institutions and enterprises to establish information security management systems. In this study, the methodologies and tools covered under the Risk Management / Risk Assessment inventory within the European Union Agency for Network and Information Security (ENISA)'s Threat and Risk Management studies are investigated. The seventeen methods and thirty-one tools inventoried by ENISA are introduced at a basic level. The methods and tools are compared among themselves in different aspects, such as the type of risk classification, the reference level, the definition of applicability, the lifecycle, and whether their usage is licensed.
International Conference on Cyber Security and Computer Science
ICONCS
N.YALÇIN
B.KILIÇ
This work applies the Optimal Foraging Algorithm (OFA), a performance optimization algorithm, to constrained optimization problems using the thirteen test functions (g01) to (g13), each over 30 runs, and then presents and discusses the comparison of the results on these problems. The OFA algorithm has previously been tested on unconstrained optimization problems, where it showed excellent performance. In this research, OFA is applied to solve constrained problems, and its performance is compared with another optimization algorithm to assess how well it works.
International Conference on Cyber Security and Computer Science
ICONCS
Yüksel Çelik
SAEEDA
In this paper, the Jaya algorithm is used for finding the optimal unit sizing of renewable energy resource (RER) components, including photovoltaic (PV) panels, wind turbines (WTs) and a fuel cell (FC), with the objective of reducing the consumer's total annual cost in a stand-alone system. System reliability is considered using the maximum allowable loss of power supply probability (LPSPmax) provided by the consumer. The methodology is applied to real solar irradiation and wind speed data for Hawksbay, Pakistan. The results achieved show that when the LPSPmax values are set to 0% and 2%, the PV-FC system is the most cost-effective compared to the PV-WT-FC and WT-FC systems.
International Conference on Cyber Security and Computer Science
ICONCS
Asif Khan
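The Jaya update rule used by the sizing method above is compact enough to sketch directly. Here it minimizes a simple sphere function as a stand-in for the annual-cost objective, with bounds and sizes chosen as illustrative assumptions rather than the paper's actual problem.

import numpy as np

rng = np.random.default_rng(3)

def objective(x):
    return np.sum(x ** 2)        # stand-in for total annual cost of (PV, WT, FC) sizes

POP, DIM, ITERS = 20, 3, 200
low, high = -10.0, 10.0
pop = rng.uniform(low, high, (POP, DIM))
fit = np.array([objective(x) for x in pop])

for _ in range(ITERS):
    best, worst = pop[fit.argmin()], pop[fit.argmax()]
    for i in range(POP):
        r1, r2 = rng.random(DIM), rng.random(DIM)
        # Jaya: move toward the best solution and away from the worst one.
        cand = pop[i] + r1 * (best - np.abs(pop[i])) - r2 * (worst - np.abs(pop[i]))
        cand = np.clip(cand, low, high)
        f = objective(cand)
        if f < fit[i]:           # greedy acceptance, no algorithm-specific parameters
            pop[i], fit[i] = cand, f

print("best sizing vector:", pop[fit.argmin()].round(4), "objective:", fit.min().round(6))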
In this paper, the outdoor performance of Multiple Input Multiple Output (MIMO) systems at two ultra-high frequencies is analyzed. A simulation with realistic scenarios is set up using the Statistical Spatial Channel Model (SSCM) over a frequency-selective Rayleigh fading channel. The channel behavior of MIMO at the two ultra-high frequencies (28 and 73 GHz) is investigated using different numbers of transmitter and receiver antennas, different receiver distances and different propagation scenarios. Parameters such as path loss and received power are calculated and compared to investigate MIMO performance at these frequencies. The effect of using different antennas is also investigated. The results are analyzed, and conclusions are drawn about the main characteristics and the usability of these ultra-high frequencies. The investigated frequencies are candidates to become key components of cellular 5G networks, and thus it is vital to investigate them to assist engineers in designing their 5G networks.
International Conference on Cyber Security and Computer Science
ICONCS
Alauddin Al-Omary
Broken Access Control (BAC), ranked as the 5th most crucial vulnerability by the Open Web Application Security Project (OWASP), appears to be critical in web applications because of its adverse consequence, privilege escalation, which may lead to huge financial loss and reputational damage for the company. An intruder can gain unauthorized access to a web system, or an upgraded access level, by exploiting a BAC vulnerability arising from inadequate validation of user credentials, misconfigured sensitive data disclosure, inappropriate use of functions in the code, unmanaged exception handling, uncontrolled webpage redirection, and so on. This paper aims to raise awareness of the risk posed by BAC vulnerabilities in web applications among their designers, developers, administrators and owners, considering the facts and findings of this document before hosting an application live. The experiment was conducted on 330 web applications using a manual penetration testing method following a double-blind testing strategy, and 39.09% of the sites were found to be vulnerable. Exploitation techniques based on access to redirection settings, misconfigured sensitive data retrieval, and unauthorized cookie access were performed on the sample sites across five sectors, and the results were analyzed with respect to the cause of BAC, platform, domain, and operating system. Binary logistic regression, Pearson's χ2 value, odds ratios and p-value tests were performed to analyze correlations among the factors of BAC. This examination also revealed that ignoring session misconfiguration and improper input validation are the critical factors creating BAC vulnerabilities in applications.
International Conference on Cyber Security and Computer Science
ICONCS
M. M. Hassan
M. A. Ali
T. Bhuiyan
M. H. Sharif
S. Biswas
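A small sketch of the statistical analysis step named in the abstract above: fitting a binary logistic regression with statsmodels and reading off odds ratios and p-values for candidate BAC factors. The data below are randomly generated placeholders, not the 330-site sample, and the factor names are illustrative.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 330

# Hypothetical binary factors observed per site.
df = pd.DataFrame({
    "session_misconfig": rng.integers(0, 2, n),
    "improper_input_validation": rng.integers(0, 2, n),
    "https_used": rng.integers(0, 2, n),
})
# Synthetic outcome: vulnerability more likely when the first two factors are present.
logit = -1.5 + 1.2 * df["session_misconfig"] + 0.9 * df["improper_input_validation"]
df["bac_vulnerable"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(df[["session_misconfig", "improper_input_validation", "https_used"]])
model = sm.Logit(df["bac_vulnerable"].astype(int), X).fit(disp=False)

print(pd.DataFrame({"odds_ratio": np.exp(model.params),
                    "p_value": model.pvalues}).round(3))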
In the modern world, with growing technology, load forecasting is one of the significant concerns in power systems and energy management. Better precision in load forecasting minimizes operational costs and enhances the scheduling of the power system. The literature has proposed different techniques for demand load forecasting, such as neural networks, fuzzy methods, Naïve Bayes and regression-based techniques. This paper proposes a novel knowledge-based
system for short-term load forecasting. The proposed system has
minimum operational time as compared to other techniques used
in the paper. Moreover, the precision of the proposed model is
improved by a different priority index to select similar days.
The similarity in climate and date proximity are considered
all together in this index. Furthermore, the whole system is
distributed in sub-systems (regions) to measure the consequences
of temperature. Besides, the predicted load of the entire system
is evaluated by the combination of all predicted outcomes from
all regions. The paper employs the proposed knowledge based
system on real time data. The proposed model is compared with
Deep Belief Network and Fuzzy Local Linear Model Tree in
terms of accuracy and operational cost. In addition, the proposed
system outperforms other techniques used in the paper and also
decreases the Mean Absolute Percentage Error (MAPE) on a yearly
basis. Furthermore, the proposed knowledge based system gives
more efficient outcomes for demand load forecasting.
International Conference on Cyber Security and Computer Science
ICONCS
Mahnoor Khan
Nadeem Javaid
Yüksel Çelik
Asma Rafique
The main goal of this research work is to describe
automation process of forming a relational structured database in
the Hadoop ecosystem environment. The selection of a source in the Internet environment, extracting information online, choosing an import tool, and studying unstructured data in Hadoop are described. The use of tools (systems, utilities) such as MongoDB and Hadoop in this research allows operational and analytical technologies to be combined.
International Conference on Cyber Security and Computer Science
ICONCS
N. SAPARKHOJAYEV
A. MUKASHEVA
P. SAPARKHOJAYEV
The present study aimed to investigate the interaction with technology of infants, for whom refraining from the use of technological materials is emphasized. In the study, the
views of the parents on the significance of the television, phone
and tablet computers in infant’s life, the conditions under which
the infants interact with these communication technologies and
their behavior during these interactions were investigated. The
study group included 14 infants and their mothers reached using
convenience sampling method. In the present qualitative study,
data were collected with a semi-structured interview and video
recording methods. Thus, it was determined that the interaction
between the infants and the television started in the 5th month. It
was observed that the infants interacted with their smart phones
and tablet computers daily and their mothers utilized interesting
features of technological instruments to attract the attention of
infants, especially during feeding, and to amuse the infants while
they were busy with another task. It was concluded that mothers
did not have adequate knowledge on the advantages and
disadvantages of technological material for the infant. It was
determined that the infants demonstrated great interest in technological materials due to their visual and auditory attractiveness; in particular, they perceived smart phones as toys and mimicked adults in using them.
International Conference on Cyber Security and Computer Science
ICONCS
Arzu Özyürek
Rüveyda TAŞKAYA
Aslıhan BOZ
Gizem Güler BAŞAR
Hasan Hüseyin SAÇ
İsmail Talha ILGIN
Merve ERDOĞMUŞ
Zübeyde KAYAKÇI DANIŞMAZ
In this work, we propose an architecture and interface that enable the management and configuration of a portable crypto device running on an embedded system. The developed system is also designed to be capable of performing management tasks on any embedded system. Since the architecture is not language dependent, the programming language can be changed according to platform requirements. The management system uses a database on the GNU/Linux operating system and runs the necessary commands on the embedded system via an RPC scheme. Measures have been taken against security threats in the developed system by using a secure transport layer. The system is designed as a client-server architecture. Because the C++ programming language is close to machine language, it runs faster than other common languages and is therefore used on the server side of the management system. Since Java isolates operating system incompatibilities, it is used on the client side; as the desktop application's interface uses Java, it was developed using the Java SWT library.
International Conference on Cyber Security and Computer Science
ICONCS
İlhami Muharrem ORAK
O.YILDIZ
The stock market is a marketplace that facilitates the buying and selling of company stocks. Finding the right time to buy or sell a stock, considering market movement, is a tricky decision. Therefore, predicting the trend of stock buying/selling prices is of great interest to stock traders and investors looking for the right time to buy or sell stocks. This paper aims to develop an intelligent system using Trend Estimation with Linear Regression (TELR) for making and visualizing predictions. This system can guide a trader or investor, with or without expertise in the stock market, toward profitable investments. We have used stock data from the Stock Exchange of Bangladesh, which covers 300+ companies including 29 banks, to train and test our system. We fitted the trend with the maximum likelihood estimation method, training our system on stock data up to December 2017 and then testing it on the stock values of January 2018. A comparison of the trend values derived from the intelligent system with real stock values is presented to show the effectiveness of the intelligent decision system.
International Conference on Cyber Security and Computer Science
ICONCS
Md. Iftekharul Alam Efat
Rakibul Bashar
K. M. Imtiaz Uddin
T. Bhuiyan
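A minimal version of the trend-estimation idea behind TELR as described above: fit a linear trend to a closing-price series by least squares (equivalent to maximum likelihood under Gaussian noise) and extrapolate it one step ahead. The price series is an invented placeholder, not data from the exchange used in the paper.

import numpy as np

# Hypothetical daily closing prices for one stock (placeholder data).
prices = np.array([52.0, 52.4, 51.8, 53.1, 53.6, 54.0, 53.7, 54.9, 55.2, 55.8])
t = np.arange(len(prices))

# Least-squares linear trend; under i.i.d. Gaussian noise this is also the MLE.
slope, intercept = np.polyfit(t, prices, deg=1)
next_day = slope * len(prices) + intercept

signal = "upward trend -> consider buy" if slope > 0 else "downward trend -> consider sell"
print(f"slope={slope:.3f}, predicted next close={next_day:.2f}, {signal}")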
With the advancing technology, the storage of large
amounts of data has become possible. The unstructured nature of data makes it difficult to access. Many sectors demand access to specific information within their area; thus, the concept of the vertical search engine has emerged.
In our study, a crawler was designed to filter reliable sites. The designed crawler only adds results related to academic publications to the database. The Naive Bayes classifier algorithm was employed to identify the branch of science of an academic publication from its abstract. According to our experiments, the accuracy rate of the developed vertical search engine was 70%.
The application is designed in a way that it can self-learn so that
the success rate can increase.
International Conference on Cyber Security and Computer Science
ICONCS
Asım Yüksel
Muhammed Ali Karabıyık
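A compact sketch of the abstract-classification component described above, using a bag-of-words Naive Bayes classifier from scikit-learn. The tiny labeled corpus is invented for illustration; the real system would be trained on crawled abstracts and their branches of science.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training corpus: (abstract snippet, branch of science) pairs.
abstracts = [
    ("we prove a theorem on prime numbers and modular arithmetic", "mathematics"),
    ("a deep neural network is trained to classify network intrusions", "computer science"),
    ("the enzyme activity was measured in cell cultures", "biology"),
    ("a convolutional model improves image segmentation accuracy", "computer science"),
    ("gene expression profiles were compared across tissue samples", "biology"),
]
texts, labels = zip(*abstracts)

model = make_pipeline(CountVectorizer(stop_words="english"), MultinomialNB())
model.fit(texts, labels)

new_abstract = "a neural network is trained to detect intrusions in campus network traffic"
print(model.predict([new_abstract])[0])   # expected: computer science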