The Future of SSLStrip: HTTPS Front-End Hijacking (1)


0x00 Preface

In the previous article on traffic hijacking, we mentioned an "HTTPS downgrade" scheme: replace every HTTPS hyperlink on the page with its HTTP version, so that the user ends up communicating in plain text.
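The scheme above can be sketched in a few lines of JavaScript (the page snippet and domain here are hypothetical, and the regex is a deliberate oversimplification of what a real tool does):

```javascript
// Minimal sketch of the "HTTPS downgrade" idea: rewrite every HTTPS
// hyperlink in the intercepted page to its HTTP version.
const html = '<a href="https://mail.example.com/">Mail</a>';

const downgraded = html.replace(/href="https:\/\//g, 'href="http://');

console.log(downgraded); // <a href="http://mail.example.com/">Mail</a>
```

If the user then clicks the rewritten link, the request leaves the browser over plain HTTP, where the man-in-the-middle can read and modify it freely.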

Reading this, you may have thought of a classic man-in-the-middle attack tool, SSLStrip, which can indeed achieve this effect.

However, what we explain today is a completely different idea: a more effective and more advanced solution, HTTPS front-end hijacking.

0x01 Backend Defects

In the past, traffic hijacking was implemented almost entirely in the backend, and SSLStrip is a typical example.

Like other man-in-the-middle tools, a pure backend implementation can only manipulate the raw traffic data. This severely limits how far the attack can evolve, and it raises many hard problems:

What about dynamic elements?

How should packet fragmentation be handled?

Can performance consumption be reduced?

......

Dynamic Elements

In the early days of the Web, tools such as SSLStrip were quite effective. Web pages back then were mostly static, with simple and clear structure, so replacing links in the traffic was entirely adequate.

Today's web pages, however, are increasingly complex, and scripts make up a growing share of them. If we work only on the traffic, there is obviously nothing we can do about links that are generated at runtime.

 
 
  var protocol = 'https';
  document.write('<a href="' + protocol + '://www.alipay.com/">Login</a>');

Even for a dynamic element as simple as this, the backend approach is helpless: the final URL never appears literally in the traffic.
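A short sketch illustrates the problem (the page content and the replacement regex are assumptions, modeled on the kind of rewrite SSLStrip performs):

```javascript
// Hypothetical raw page as seen on the wire: the HTTPS link is assembled
// at runtime by a script, so the literal string "https://" never appears.
const page =
  "<script>var protocol = 'https';" +
  "document.write('<a href=\"' + protocol + '://www.alipay.com/\">Login</a>');" +
  "<\/script>";

// A backend rewrite of the raw traffic, SSLStrip-style:
const stripped = page.replace(/https:\/\//g, 'http://');

// Nothing matches, so the page passes through untouched:
console.log(stripped === page); // true
```

The browser, of course, concatenates `'https'` and `'://...'` only after the script runs, long after the middleman has finished looking at the bytes.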

Fragment Handling

We all understand how multipart transmission works: a large response cannot be delivered in one breath, so the client receives data blocks one by one and merges them into the complete page.

Each fragment the middleman receives may therefore be incomplete, which causes plenty of trouble for link replacement: a URL can be split across two packets. On top of that, many pages are not encoded in standard UTF-8, which makes matching and replacing even harder.

To keep the replacement reliable, the man-in-the-middle usually buffers the data, waiting until the page has been fully received before performing the replacement.
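A sketch of the fragmentation problem, with a made-up URL split across two chunks:

```javascript
// Hypothetical: the link arrives split across two TCP segments.
const chunk1 = '<a href="ht';
const chunk2 = 'tps://www.alipay.com/">Login</a>';

const rewrite = s => s.replace(/https:\/\//g, 'http://');

// Rewriting chunk by chunk misses the split URL entirely:
const perChunk = rewrite(chunk1) + rewrite(chunk2);
console.log(perChunk === chunk1 + chunk2); // true, nothing was replaced

// Only by buffering the whole page first does the pattern reassemble:
const buffered = rewrite(chunk1 + chunk2);
console.log(buffered); // <a href="http://www.alipay.com/">Login</a>
```

This is exactly why backend tools fall back to buffering, at the cost of the latency described next.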

If we compare the data to flowing water, such a proxy intercepts the steady stream like a dam, holding everything back until the reservoir is full. The people downstream must endure a long drought before the water finally reaches them.

Performance Overhead

Because HTML is burdened with compatibility for countless legacy rules, replacement is far from easy.

The various complex regular expressions involved consume a great deal of CPU. Even though the user will ultimately click only one or two links, the man-in-the-middle has no way to know which ones in advance, so it must still analyze the entire page. That is the real sorrow of the backend approach.
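A rough sketch of the wasted work (the page, domain, and link count are all made up):

```javascript
// Hypothetical page with 1000 links. The backend MITM must scan and rewrite
// every single one up front, even if the user ends up clicking just one.
const page = Array.from({ length: 1000 },
  (_, i) => `<a href="https://example.com/page${i}">link ${i}</a>`).join('\n');

const rewritten = page.replace(/https:\/\//g, 'http://');

// All 1000 links were processed for (at most) one eventual click:
console.log((rewritten.match(/http:\/\//g) || []).length); // 1000
```

A front-end approach, by contrast, can defer the work until the moment a link is actually used, which is the direction the rest of this series explores.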

