Thursday, January 17, 2019

Threat Intelligence – Regulations


Any U.S. bank should be aware of the following laws, regulations, and standards:

Strongly recommended:
OCC (Office of the Comptroller of the Currency)
NYDFS 23 NYCRR Part 500 (DFS 500)
FFIEC (including its Handbooks and Booklets)
Swift Customer Security Program (CSP)
CHIPS
Fedline Security Controls
GLBA

Optional:
PCI DSS (only applicable to the payment card business)
ISO27001/ISO27002
NIST SP800
FIPS 140-2

Wednesday, January 16, 2019

Use ProxyChains with Tor

#/etc/init.d/tor start

Edit /etc/proxychains.conf: enable dynamic_chain (comment out strict_chain) and proxy_dns, and point [ProxyList] at Tor's local SOCKS port. A single socks5 entry is enough, since Tor's SOCKS port accepts both SOCKS4 and SOCKS5 clients:
#vi /etc/proxychains.conf
dynamic_chain
proxy_dns
tcp_read_time_out 15000
tcp_connect_time_out 8000
[ProxyList]
socks5  127.0.0.1 9050
:wq

#proxychains lynx http://v4.ifconfig.co/
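
Because ProxyChains hooks libc socket calls via LD_PRELOAD, it can wrap any dynamically linked program, not just lynx. As a quick self-check, a minimal Python sketch (the /ip path and the plain-text reply of ifconfig.co are assumptions to verify; check_ip.py is a hypothetical filename) can be run as "proxychains python3 check_ip.py":
--------------------------------
import urllib.request

# Request the caller's public IPv4 address; a curl-style User-Agent
# asks ifconfig.co for a plain-text reply (assumed behavior).
req = urllib.request.Request(
    "http://v4.ifconfig.co/ip",
    headers={"User-Agent": "curl/7.64.0"},
)
with urllib.request.urlopen(req, timeout=30) as resp:
    # When the chain works, this prints a Tor exit node's address.
    print(resp.read().decode().strip())
--------------------------------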

Monday, January 14, 2019

Threat Intelligence - Security News

Security news can be sourced from the following websites:
https://www.infosecurity-magazine.com/news/
https://threatpost.com/
https://securityintelligence.com/news/
https://www.securityweek.com/
https://www.cnet.com/topics/security/
https://www.bankinfosecurity.com/latest-news
https://www.darkreading.com/
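
Most of these sites also publish RSS feeds, which are easier to monitor programmatically. A minimal sketch using the third-party feedparser library; the exact feed path (/feed/ on threatpost.com) is an assumption to verify:
--------------------------------
import feedparser   # third-party: pip install feedparser

# Pull the latest headlines from one of the sites listed above.
feed = feedparser.parse("https://threatpost.com/feed/")
for entry in feed.entries[:10]:
    print(entry.title + " | " + entry.link)
--------------------------------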

Sunday, January 13, 2019

Launching reconnaissance in Kali

#whois aaa.com                                    '''Registrant and registrar details
#dig aaa.com soa                                  '''Start-of-authority record
#dig aaa.com ns                                   '''Name servers
#dig aaa.com a                                    '''IPv4 addresses
#dig aaa.com mx                                   '''Mail exchangers
#dig aaa.com txt                                  '''TXT records (SPF, verification strings, etc.)
#fierce -dns aaa.com                              '''Subdomain brute-forcing and zone-transfer checks
#dnsrecon -d aaa.com -a --iw -z                   '''-a: zone transfer; --iw: continue on wildcard; -z: DNSSEC zone walk
#theharvester -d aaa.com -b all -l 1000 -h        '''Harvest emails/hosts from all sources, limit 1000, query Shodan
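
The dig lookups can also be scripted. A minimal sketch using the third-party dnspython library (version 2.x assumed), with aaa.com kept as the placeholder domain from the commands above:
--------------------------------
import dns.resolver   # third-party: pip install dnspython

DOMAIN = "aaa.com"    # placeholder domain

for rtype in ("SOA", "NS", "A", "MX", "TXT"):
    try:
        answers = dns.resolver.resolve(DOMAIN, rtype)
    except Exception as exc:   # NXDOMAIN, NoAnswer, timeout, ...
        print(rtype + ": " + str(exc))
        continue
    for rdata in answers:
        print(rtype + ": " + rdata.to_text())
--------------------------------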

Saturday, January 12, 2019

Set up and use Tor Client in Kali

[Install Tor Client in Kali]:
#apt-get update
#apt-cache search tor | grep '^tor'
#apt-get install tor


[Start Tor service]:
#/etc/init.d/tor start


[How to use Tor service through regular browsers]:
Point the browser's SOCKS v5 proxy to 127.0.0.1:9050 and enable the option to resolve DNS through the SOCKS v5 proxy (in Firefox: "Proxy DNS when using SOCKS v5").


[Use the following .onion sites to verify that you can reach the darknet]:
http://xmh57jrzrnw6insl.onion/             '''A Tor search engine
http://torlinkbgs6aabns.onion/             '''A darknet link directory ("yellow pages")
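
The Tor circuit can also be verified from code. A minimal sketch using requests with its SOCKS extra (pip install requests[socks]); check.torproject.org/api/ip returns JSON reporting whether the request arrived through Tor:
--------------------------------
import requests   # third-party: pip install requests[socks]

# socks5h:// makes requests resolve DNS through Tor as well.
proxies = {"http": "socks5h://127.0.0.1:9050",
           "https": "socks5h://127.0.0.1:9050"}

resp = requests.get("https://check.torproject.org/api/ip",
                    proxies=proxies, timeout=60)
print(resp.json())   # expect "IsTor": true when the circuit works
--------------------------------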

Friday, January 11, 2019

Install and run Scrapy to find pages containing specific keywords

[Install Scrapy]:
#apt-get update
#apt-get install python3-scrapy


[Set up a spider]:
#scrapy startproject search_keywords              '''Here we create a project called search_keywords
#cd search_keywords
#scrapy genspider demonalex demonalex.com         '''Here we create a spider called demonalex
#cd search_keywords/spiders
#cp ./demonalex.py ./demonalex_py.bak


[Modify the spider script]:
#vi ./demonalex.py                                '''Modify the content of the spider script called demonalex.py
--------------------------------
from scrapy.spiders import CrawlSpider, Rule
from scrapy.linkextractors import LinkExtractor
import re


def find_all_substrings(string, sub):
    # Return the start offset of every occurrence of sub in string.
    return [match.start() for match in re.finditer(re.escape(sub), string)]


class WebsiteSpider(CrawlSpider):

    name = "demonalex"                            # The name of the spider
    allowed_domains = ["www.phooky.com"]          # Here we define the domain being crawled
    start_urls = ["http://www.phooky.com"]        # Here we define the start point being scanned
    rules = [Rule(LinkExtractor(), follow=True, callback="check_buzzwords")]

    crawl_count = 0
    words_found = 0

    def check_buzzwords(self, response):
        self.__class__.crawl_count += 1

        # Skip binary responses (images, PDFs, ...), which have no text.
        if getattr(response, "encoding", None) is None:
            return

        # This is the keyword list.
        wordlist = [
            "Lorem",
            "dolores",
            "feugiat",
        ]

        url = response.url
        data = response.text   # body decoded with the response's declared encoding

        # Print one "keyword;url;" line per occurrence found.
        for word in wordlist:
            for pos in find_all_substrings(data, word):
                self.__class__.words_found += 1
                print(word + ";" + url + ";")

    def _requests_to_follow(self, response):
        # Only follow links on text responses; binary responses have no encoding.
        if getattr(response, "encoding", None) is not None:
            return CrawlSpider._requests_to_follow(self, response)
        return []
--------------------------------


[Execute the spider]:
#scrapy crawl demonalex
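
Scrapy's own log messages can drown out the keyword;url; lines; lowering the log level at crawl time keeps the output readable:
#scrapy crawl demonalex -s LOG_LEVEL=ERROR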

Tuesday, January 8, 2019

[Threat Intelligence] Threat Intelligence Report Template

1)Security News [Reader: IT,IS,RISK]
It helps raise the security awareness of IT, IS, and RISK.

2)New Security Regulation (Specific) [Reader: IT,IS,RISK]
Once newly added regulations are identified, IT, IS, and RISK should trigger a task to revise the corresponding policies and procedures.

3)New Vulnerabilities (Specific) [Reader: IT,IS]
IT and IS should follow up by triggering a hardening process against the newly disclosed vulnerabilities.

4)New Threats (Specific) [Reader: IT,IS]
The risk assessment team should add newly identified threats to the Threat Pool used for Risk Assessment.

5)Data Leakage Investigation (Specific; covering data breaches found on the Internet and the darknet) [Reader: IT,IS,RISK,Management]
When a data leakage event occurs, the Incident Response process should be triggered.

6)Indicator of Compromise (IOC) feeds (these can be fed into threat detection systems) [Reader: IT,IS]
The feeds should include the categories below:
-IP Address
-Domain
-URL
-Transport-layer Port Number
-Email Address
-Filename
-File Path
-Hash (MD5 or SHA)
-String
The IOC feeds should be imported into threat detection systems such as IDS/IPS, UTM, anti-virus, or even SIEM; a minimal sketch of a feed record appears after this list.

7)Action Plan (Specific; in response to new regulations, vulnerabilities, threats, and IOCs) [Reader: IT,IS,RISK,Management]
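
For item 6, a hypothetical sketch of what a normalized feed record could look like before import; the field names and sample values are invented for illustration, and real feeds usually arrive as STIX/TAXII or a vendor-specific format:
--------------------------------
import csv

# Hypothetical minimal schema covering the categories listed in item 6.
FIELDS = ["type", "value", "source", "first_seen"]
iocs = [
    {"type": "ip",     "value": "203.0.113.10",  # documentation address range
     "source": "osint", "first_seen": "2019-01-08"},
    {"type": "domain", "value": "bad.example",
     "source": "osint", "first_seen": "2019-01-08"},
    {"type": "md5",    "value": "d41d8cd98f00b204e9800998ecf8427e",
     "source": "osint", "first_seen": "2019-01-08"},
]

# Write a CSV that an IDS/IPS, UTM, or SIEM could ingest.
with open("ioc_feed.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(iocs)
--------------------------------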

Saturday, January 5, 2019

Use iptables to block ports

Block a port:
#iptables -A INPUT -p tcp --dport 22 -j REJECT

See all rules:
#iptables --list

Flush all rules:
#iptables --flush
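
Remove a single rule by repeating it with -D instead of -A:
#iptables -D INPUT -p tcp --dport 22 -j REJECT

Note that rules added this way do not persist across a reboot unless they are saved (e.g. with iptables-save) and restored at boot.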