Parsing MySQL: removing duplicate records with single-table DISTINCT and multi-table GROUP BY queries

Source: Internet
Author: User
Tags: mysql, manual

De-duplicating a single table: DISTINCT
De-duplicating across multiple tables: GROUP BY
When DISTINCT is used in a multi-table query, LEFT JOIN still works, but FULL JOIN does not.
When using MySQL, you sometimes need to query records that are unique on a particular field. MySQL provides the DISTINCT keyword to filter out duplicate records and keep only one, but it is mostly useful for returning the number of non-duplicate records rather than all the columns of those records. The reason is that DISTINCT can only return its target field(s); it cannot return other fields. Whenever DISTINCT could not solve the problem, the only workaround I had was a nested double-loop query, which directly hurts performance on a site with a large data volume.
Let's take a look at the example below:
The table structure is as follows:
id   name
1    a
2    b
3    c
4    c
5    b
The basic table structure is like this; it is just a simple example, and real multi-table queries are much more complicated.
For example, to query all rows with no duplicate names in a single statement, you must use DISTINCT to remove the redundant duplicate records.
SELECT DISTINCT name FROM table
The result is:
name
a
b
c
It seems the desired effect has been achieved, but what if I also want the id value? Let's modify the query statement:
SELECT DISTINCT name, id FROM table
The result is:
id   name
1    a
2    b
3    c
4    c
5    b
Why does DISTINCT not seem to work? It actually does work, but it now applies to both fields: a row is eliminated only when both id and name are identical.
Modify the query statement again:
SELECT id, DISTINCT name FROM table
Unfortunately, this returns nothing but an error message: DISTINCT must come first in the field list. Could DISTINCT be placed in the WHERE condition instead? That attempt also reports an error.
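The failed WHERE attempt is not shown in the original; a guess at what was tried, which MySQL likewise rejects with a syntax error, is something like:

SELECT id, name FROM table WHERE DISTINCT(name)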

After spending a long time trying every other method I could think of, I finally found a usage in the MySQL manual: GROUP_CONCAT(DISTINCT name) combined with GROUP BY name should implement exactly what I needed. Excited, I tried it right away.
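The exact statement is not given in the original; a minimal sketch of that attempt, using the same sample table, would be something like:

SELECT *, GROUP_CONCAT(DISTINCT name) FROM table GROUP BY name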
An error was reported. Depressing!
Even the MySQL manual let me down: it gave me hope first and then pushed me back down.
Checking again: the GROUP_CONCAT function is only supported from MySQL 4.1, and I was on 4.0. Dizzying. No way around it but to upgrade, and the upgrade succeeded.
It also meant the customer would ultimately have to be asked to upgrade.
Then came a flash of inspiration: if the GROUP_CONCAT function can be used this way, can other aggregate functions be used as well?
I quickly tried the COUNT function, and it worked; so simple, after all that wasted time.
Now release the complete statement:
SELECT *, COUNT(DISTINCT name) FROM table GROUP BY name
Result:
id   name   count(distinct name)
1    a      1
2    b      1
3    c      1
The last column is redundant; just ignore it.
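If the extra column should not be returned at all, a plain GROUP BY works too; this is only a sketch, and it assumes a MySQL version or SQL mode (like the 4.x used here) that permits selecting non-aggregated columns with GROUP BY:

SELECT id, name FROM table GROUP BY name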
It turns out MySQL only needed to be tricked a little. I hope this saves you from getting stuck on the same problem.
By the way, GROUP BY must be placed before ORDER BY and LIMIT, otherwise an error will be reported.
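For instance, a sketch that combines the working query above with ORDER BY and LIMIT in the required order (the LIMIT value here is arbitrary, and the non-aggregated ORDER BY column again assumes a non-strict SQL mode):

SELECT *, COUNT(DISTINCT name) FROM table GROUP BY name ORDER BY id DESC LIMIT 10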
Now let's look at a real-world example of GROUP BY:

The Code is as follows:


$sql = 'SELECT DISTINCT n.nid, tn.tid, n.title, n.created, ni.thumbpath
  FROM {term_node} tn
  INNER JOIN {node} n ON n.nid = tn.nid
  INNER JOIN {node_images} ni ON ni.nid = n.nid
  WHERE tn.tid IN (' . implode(',', $tids) . ')
  ORDER BY n.nid DESC';
$res = db_query($sql);
$t_data = array();
while ($r = db_fetch_array($res)) {
  print_r($r);
}


With this query, the same nid always appears twice, as in the following results:

The Code is as follows:


Array
(
    [created] => 1215331278
    [nid] => 1603
    [tid] => 32
    [title] => DIY green qinyin for summer weddings
    [thumbpath] => files/node_images/home-77.1_tn.jpg
)
Array
(
    [created] => 1215331278
    [nid] => 1603
    [tid] => 32
    [title] => DIY green qinyin for summer weddings
    [thumbpath] => files/node_images/003_primary_tn.jpg
)


DISTINCT does not help here, because the duplicate rows differ in thumbpath; what is actually wanted is for nid to be unique in the result set.
Finally, GROUP BY is used instead.

The Code is as follows:


// GROUP BY n.nid collapses the duplicate rows down to one row per nid.
$sql = 'SELECT n.nid, tn.tid, n.title, n.created, ni.thumbpath
  FROM {term_node} tn
  INNER JOIN {node} n ON n.nid = tn.nid
  INNER JOIN {node_images} ni ON ni.nid = n.nid
  WHERE tn.tid IN (' . implode(',', $tids) . ')
  GROUP BY n.nid DESC';
$res = db_query($sql);
$t_data = array();
while ($r = db_fetch_array($res)) {
  print_r($r);
}


Now each nid appears only once in the results.
