Five Common PHP Database Problems


There is no single correct way to use a database: you can build your database design, your database access code, and the PHP business logic on top of it in any number of ways, and the result is often wrong. This article describes five common problems that show up in database design and in the PHP code that accesses the database, and how to fix them when you encounter them.

Problem 1: Using MySQL directly

A common problem is older PHP code that uses the mysql_ functions to access the database directly. Listing 1 shows this kind of direct database access.

Listing 1. access/get.php

<?php
function get_user_id($name)
{
  $db = mysql_connect('localhost', 'root', 'password');
  mysql_select_db('users');

  $res = mysql_query("SELECT id FROM users WHERE login='" . $name . "'");
  while ($row = mysql_fetch_array($res)) { $id = $row[0]; }

  return $id;
}

var_dump(get_user_id('Jack'));
?>

Notice the use of the mysql_connect function to access the database. Also notice the query, in which string concatenation is used to splice the $name parameter into the SQL.

There are two good alternatives to this technique: the PEAR DB module and the PHP Data Objects (PDO) classes. Both provide abstraction from the choice of a specific database. As a result, your code can run on IBM DB2, MySQL, PostgreSQL, or any other database you want to connect to, without much tweaking.

Another benefit of using the PEAR DB module or the PDO abstraction layer is that you can use the ? placeholder operator in your SQL statements. Doing so makes the SQL easier to maintain and protects your application from SQL injection attacks.
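To see why the concatenation in Listing 1 is dangerous, consider what the query text becomes for a hostile input. This is only an illustrative sketch: the $name value below is a made-up attack string, not anything from the original code.

<?php
// Illustrative only: a made-up attacker-controlled value.
$name = "jack' OR '1'='1";

// String concatenation produces:
//   SELECT id FROM users WHERE login='jack' OR '1'='1'
// which matches every row in the table instead of a single user.
$sql = "SELECT id FROM users WHERE login='" . $name . "'";
echo $sql . "\n";
?>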

The alternative code for using PEAR DB is shown below.

Listing 2. access/get_good.php

<?php
require_once("DB.php");

function get_user_id($name)
{
  $dsn = 'mysql://root:password@localhost/users';
  $db =& DB::connect($dsn, array());
  if (PEAR::isError($db)) { die($db->getMessage()); }

  $res = $db->query('SELECT id FROM users WHERE login=?',
    array($name));

  $id = null;
  while ($res->fetchInto($row)) { $id = $row[0]; }

  return $id;
}

var_dump(get_user_id('Jack'));
?>

Notice that all direct uses of MySQL are gone except for the database connection string in $dsn. In addition, we use the ? operator in the SQL instead of concatenating the $name variable into the statement; the query data is passed in through the array at the end of the query() method.
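The listing above uses PEAR DB, but PDO, the other alternative mentioned earlier, supports the same ? placeholders. A rough PDO equivalent might look like the sketch below; the DSN, user name, and password are assumed placeholders rather than values taken from the article.

<?php
// A minimal PDO sketch of the same lookup (assumed DSN and credentials).
function get_user_id($name)
{
  $pdo = new PDO('mysql:host=localhost;dbname=users', 'root', 'password');
  $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

  // The ? placeholder is bound by the driver, so $name is never spliced into the SQL.
  $sth = $pdo->prepare('SELECT id FROM users WHERE login = ?');
  $sth->execute(array($name));

  $row = $sth->fetch(PDO::FETCH_NUM);
  return $row ? $row[0] : null;
}

var_dump(get_user_id('Jack'));
?>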

Problem 2: Not using the AUTO_INCREMENT feature

Like most modern databases, MySQL can create auto-incrementing unique identifiers on a per-record basis. Despite that, we still see code that first runs a SELECT statement to find the maximum id, then adds 1 to that id and inserts a new record. Listing 3 shows an example of this bad schema.

Listing 3. badid.sql

DROP TABLE IF EXISTS users;
CREATE TABLE users (
  id MEDIUMINT,
  login TEXT,
  password TEXT
);

INSERT INTO users VALUES (1, 'Jack', 'pass');
INSERT INTO users VALUES (2, 'Joan', 'pass');
INSERT INTO users VALUES (1, 'Jane', 'pass');

The id field here is specified simply as an integer. So, even though it should be unique, we can insert any value we like, as the INSERT statements following the CREATE statement show. Listing 4 shows the PHP code that adds users to this kind of schema.

Listing 4. add_user.php

<?php
require_once("DB.php");

function add_user($name, $pass)
{
  $rows = array();

  $dsn = 'mysql://root:password@localhost/bad_badid';
  $db =& DB::connect($dsn, array());
  if (PEAR::isError($db)) { die($db->getMessage()); }

  $res = $db->query("SELECT max(id) FROM users");
  $id = null;
  while ($res->fetchInto($row)) { $id = $row[0]; }

  $id += 1;

  $sth = $db->prepare("INSERT INTO users VALUES (?,?,?)");
  $db->execute($sth, array($id, $name, $pass));

  return $id;
}

$id = add_user('Jerry', 'pass');

var_dump($id);
?>

The code in add_user.php first runs a query to find the maximum value of id, then runs an INSERT statement using that id plus one. On a heavily loaded server this code will fail in a race condition: two requests can read the same maximum id and both try to insert a record with the same new id. It is also slow.

So what's the alternative? Use MySQL's AUTO_INCREMENT feature to generate a unique id automatically for each insert. The updated schema is shown below.

Listing 5. goodid.sql

DROP TABLE IF EXISTS users;
CREATE TABLE users (
  id MEDIUMINT NOT NULL AUTO_INCREMENT,
  login TEXT NOT NULL,
  password TEXT NOT NULL,
  PRIMARY KEY (id)
);

INSERT INTO users VALUES (null, 'Jack', 'pass');
INSERT INTO users VALUES (null, 'Joan', 'pass');
INSERT INTO users VALUES (null, 'Jane', 'pass');

We added the NOT NULL flag to indicate that the fields cannot be null. We also added the AUTO_INCREMENT flag to indicate that the field auto-increments, and the PRIMARY KEY flag to mark the id field as the key. These changes speed things up. Listing 6 shows the updated PHP code that inserts a user into the table.

Listing 6. add_user_good.php

<?php
require_once("DB.php");

function add_user($name, $pass)
{
  $dsn = 'mysql://root:password@localhost/good_genid';
  $db =& DB::connect($dsn, array());
  if (PEAR::isError($db)) { die($db->getMessage()); }

  $sth = $db->prepare("INSERT INTO users VALUES (null,?,?)");
  $db->execute($sth, array($name, $pass));

  $res = $db->query("SELECT last_insert_id()");
  $id = null;
  while ($res->fetchInto($row)) { $id = $row[0]; }

  return $id;
}

$id = add_user('Jerry', 'pass');

var_dump($id);
?>

Instead of retrieving the maximum id value, I now insert the data directly with an INSERT statement and then use a SELECT statement to retrieve the id of the last inserted record. This code is much simpler and more efficient than the original version and its schema.
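PDO exposes the same MySQL facility through its lastInsertId() method, so if you prefer PDO over PEAR DB the function might look roughly like this sketch (again, the DSN and credentials are assumed placeholders):

<?php
// A minimal PDO sketch of insert-then-fetch-the-generated-id (assumed DSN and credentials).
function add_user($name, $pass)
{
  $pdo = new PDO('mysql:host=localhost;dbname=good_genid', 'root', 'password');
  $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

  $sth = $pdo->prepare('INSERT INTO users VALUES (null, ?, ?)');
  $sth->execute(array($name, $pass));

  // lastInsertId() returns the AUTO_INCREMENT value generated by the insert above.
  return $pdo->lastInsertId();
}

var_dump(add_user('Jerry', 'pass'));
?>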

Problem 3: Using multiple databases

Occasionally, we see an application in which each table lives in a separate database. There are reasons to do this in exceptionally large databases, but for an average application this level of segmentation is not needed. In addition, relational queries cannot be performed across databases, which defeats the whole point of using a relational database, not to mention that spreading tables over multiple databases makes them harder to manage.

So what do multiple databases look like in practice? First, you need some data. Listing 7 shows this data divided among four files.

Listing 7. Database files

files.sql:

CREATE TABLE files (
  id MEDIUMINT,
  user_id MEDIUMINT,
  name TEXT,
  path TEXT
);

load_files.sql:

INSERT INTO files VALUES (1, 1, 'test1.jpg', 'files/test1.jpg');
INSERT INTO files VALUES (2, 1, 'test2.jpg', 'files/test2.jpg');

users.sql:

DROP TABLE IF EXISTS users;
CREATE TABLE users (
  id MEDIUMINT,
  login TEXT,
  password TEXT
);

load_users.sql:

INSERT INTO users VALUES (1, 'Jack', 'pass');
INSERT INTO users VALUES (2, 'Jon', 'pass');

In the multi-database version of these files, you would load the files SQL statements into one database and the users SQL statements into another. The PHP code that queries the databases for the files associated with a particular user is shown below.

Listing 8. getfiles.php

<?php
require_once("DB.php");

function get_user($name)
{
  $dsn = 'mysql://root:password@localhost/bad_multi1';
  $db =& DB::connect($dsn, array());
  if (PEAR::isError($db)) { die($db->getMessage()); }

  $res = $db->query("SELECT id FROM users WHERE login=?",
    array($name));

  $uid = null;
  while ($res->fetchInto($row)) { $uid = $row[0]; }

  return $uid;
}

function get_files($name)
{
  $uid = get_user($name);

  $rows = array();

  $dsn = 'mysql://root:password@localhost/bad_multi2';
  $db =& DB::connect($dsn, array());
  if (PEAR::isError($db)) { die($db->getMessage()); }

  $res = $db->query("SELECT * FROM files WHERE user_id=?",
    array($uid));

  while ($res->fetchInto($row)) { $rows[] = $row; }

  return $rows;
}

$files = get_files('Jack');

var_dump($files);
?>

The get_user function connects to the database that contains the users table and retrieves the id for a given user. The get_files function connects to the database that contains the files table and retrieves the file rows associated with that user.

A better way to do all of this is to load the data into a single database and then run a query such as the one shown below.

Listing 9. getfiles_good.php

<?php
require_once("DB.php");

function get_files($name)
{
  $rows = array();

  $dsn = 'mysql://root:password@localhost/good_multi';
  $db =& DB::connect($dsn, array());
  if (PEAR::isError($db)) { die($db->getMessage()); }

  $res = $db->query(
    "SELECT files.* FROM users, files WHERE
     users.login=? AND users.id=files.user_id",
    array($name));

  while ($res->fetchInto($row)) { $rows[] = $row; }

  return $rows;
}

$files = get_files('Jack');

var_dump($files);
?>

This code is not only shorter, it is also easier to understand and more efficient. Instead of running two queries, we run one.

Although this problem may sound far-fetched, in practice the conclusion is usually that all tables should live in the same database unless there is a compelling reason otherwise.

Problem 4: Not using relations

Relational databases are not programming languages: they do not have an array type. Instead, they use relations between tables to build one-to-many structures between objects, which serve the same purpose as arrays. One problem I see in applications is that engineers try to use the database as though it were a programming language, creating "arrays" by storing comma-delimited identifiers in a text string. Look at the schema below.

Listing 10. Bad.sql

DROP TABLE IF EXISTS files;
CREATE TABLE files (
  id MEDIUMINT,
  name TEXT,
  path TEXT
);

DROP TABLE IF EXISTS users;
CREATE TABLE users (
  id MEDIUMINT,
  login TEXT,
  password TEXT,
  files TEXT
);

INSERT INTO files VALUES (1, 'test1.jpg', 'media/test1.jpg');
INSERT INTO files VALUES (2, 'test1.jpg', 'media/test1.jpg');
INSERT INTO users VALUES (1, 'Jack', 'pass', '1,2');

One user in the system can have multiple files. In a programming language, you would represent the user's files with an array. Here, the programmer has instead created a files field containing a comma-delimited list of file ids. To get a list of all the files for a particular user, the code must first read the row from the users table, then parse the files text and run a separate SELECT statement for each file. That code is shown below.

Listing 11. get.php

<?php
require_once("DB.php");

function get_files($name)
{
  $dsn = 'mysql://root:password@localhost/bad_norel';
  $db =& DB::connect($dsn, array());
  if (PEAR::isError($db)) { die($db->getMessage()); }

  $res = $db->query("SELECT files FROM users WHERE login=?",
    array($name));

  $files = null;
  while ($res->fetchInto($row)) { $files = $row[0]; }

  $rows = array();
  foreach (split(',', $files) as $file)
  {
    $res = $db->query("SELECT * FROM files WHERE id=?",
      array($file));
    while ($res->fetchInto($row)) { $rows[] = $row; }
  }

  return $rows;
}

$files = get_files('Jack');

var_dump($files);
?>

This technique is slow, hard to maintain, and makes poor use of the database. The only fix is to re-architect the schema into a traditional relational form, as shown below.

Listing 12. Good.sql

DROP TABLE IF EXISTS files;
CREATE TABLE files (
  id MEDIUMINT,
  user_id MEDIUMINT,
  name TEXT,
  path TEXT
);

DROP TABLE IF EXISTS users;
CREATE TABLE users (
  id MEDIUMINT,
  login TEXT,
  password TEXT
);

INSERT INTO users VALUES (1, 'Jack', 'pass');
INSERT INTO files VALUES (1, 1, 'test1.jpg', 'media/test1.jpg');
INSERT INTO files VALUES (2, 1, 'test1.jpg', 'media/test1.jpg');

Here, each file relates back to its user through the user_id field in the files table. This may seem backwards to anyone who thinks of the files as an array belonging to the user: in the relational model, the child rows point back to their parent rather than the parent holding a list of its children. But that is how relational databases work, and it makes the queries much faster and simpler. Listing 13 shows the corresponding PHP code.

Listing 13. get_good.php

<?php
require_once("DB.php");

function get_files($name)
{
  $dsn = 'mysql://root:password@localhost/good_rel';
  $db =& DB::connect($dsn, array());
  if (PEAR::isError($db)) { die($db->getMessage()); }

  $rows = array();

  $res = $db->query(
    "SELECT files.* FROM users, files WHERE users.login=?
     AND users.id=files.user_id",
    array($name));

  while ($res->fetchInto($row)) { $rows[] = $row; }

  return $rows;
}

$files = get_files('Jack');

var_dump($files);
?>

Here we make a single query to the database to get all the rows. The code is not complex, and it uses the database the way it was intended to be used.

Problem 5: The n+1 pattern

I can't tell you how many times I've seen large applications in which the code first retrieves a list of entities (say, customers) and then fetches them one by one to get the details of each. We call this the n+1 pattern because of how many queries it runs: one query to retrieve the list of all the entities, then one more for each of the n entities. That's not a problem when n=10, but what about n=100 or n=1,000? Then the inefficiency really shows. Listing 14 shows an example of such a schema.

Listing 14. Schema.sql

DROP TABLE IF EXISTS authors;
CREATE TABLE authors (
  id MEDIUMINT NOT NULL AUTO_INCREMENT,
  name TEXT NOT NULL,
  PRIMARY KEY (id)
);

DROP TABLE IF EXISTS books;
CREATE TABLE books (
  id MEDIUMINT NOT NULL AUTO_INCREMENT,
  author_id MEDIUMINT NOT NULL,
  name TEXT NOT NULL,
  PRIMARY KEY (id)
);

INSERT INTO authors VALUES (null, 'Jack Herrington');
INSERT INTO authors VALUES (null, 'Dave Thomas');

INSERT INTO books VALUES (null, 1, 'Code Generation in Action');
INSERT INTO books VALUES (null, 1, 'Podcasting Hacks');
INSERT INTO books VALUES (null, 1, 'PHP Hacks');
INSERT INTO books VALUES (null, 2, 'Pragmatic Programmer');
INSERT INTO books VALUES (null, 2, 'Ruby on Rails');
INSERT INTO books VALUES (null, 2, 'Programming Ruby');

The schema itself is sound; there is nothing wrong with it. The problem is in the code that accesses the database to find all the books by a given author, shown below.

Listing 15. get.php

<?php
require_once('DB.php');

$dsn = 'mysql://root:password@localhost/good_books';
$db =& DB::connect($dsn, array());
if (PEAR::isError($db)) { die($db->getMessage()); }

function get_author_id($name)
{
  global $db;
  $res = $db->query("SELECT id FROM authors WHERE name=?",
    array($name));
  $id = null;
  while ($res->fetchInto($row)) { $id = $row[0]; }
  return $id;
}

function get_books($id)
{
  global $db;
  $res = $db->query("SELECT id FROM books WHERE author_id=?",
    array($id));
  $ids = array();
  while ($res->fetchInto($row)) { $ids[] = $row[0]; }
  return $ids;
}

function get_book($id)
{
  global $db;
  $res = $db->query("SELECT * FROM books WHERE id=?", array($id));
  while ($res->fetchInto($row)) { return $row; }
  return null;
}

$author_id = get_author_id('Jack Herrington');
$books = get_books($author_id);
foreach ($books as $book_id) {
  $book = get_book($book_id);
  var_dump($book);
}
?>

Looking at this code, you might think, "Hey, this is really clear: first get the author id, then get the list of book ids, then get the information for each book." Yes, it's clear, but is it efficient? No. Count how many queries it takes to retrieve Jack Herrington's books: one to get the author id, one to get the list of book ids, and then one per book. Five queries for three books!

The solution is to use a single function that runs one larger query to do all the work, as shown below.

Listing 16. get_good.php

<?php
require_once('DB.php');

$dsn = 'mysql://root:password@localhost/good_books';
$db =& DB::connect($dsn, array());
if (PEAR::isError($db)) { die($db->getMessage()); }

function get_books($name)
{
  global $db;

  $res = $db->query(
    "SELECT books.* FROM authors, books WHERE
     books.author_id=authors.id AND authors.name=?",
    array($name));

  $rows = array();
  while ($res->fetchInto($row)) { $rows[] = $row; }

  return $rows;
}

$books = get_books('Jack Herrington');

var_dump($books);
?>

Retrieving the list now takes a single, fast query. It means I will probably end up with several of these methods with different parameters for different query types, but there is no alternative: if you want a PHP application that scales, you have to use the database efficiently, and that means smarter queries.

The problem with this example is that it's a little too obvious. In real applications, these kinds of n+1 or n*n problems are usually far more subtle, and they only surface when the database administrator runs a query profiler against the system because it has developed a performance problem.
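The article doesn't show a particular profiling setup, but as one hedged illustration, MySQL's own query logs can be switched on from a client session to make an n+1 query storm visible; the statements below use standard MySQL system variables, and the threshold value is only an example.

-- A minimal sketch of server-side query logging (requires admin privileges).

-- The general query log records every statement, so a burst of nearly
-- identical SELECTs (the n+1 signature) is easy to spot. It is verbose;
-- enable it only while investigating.
SET GLOBAL general_log = 'ON';

-- The slow query log records statements that exceed a time threshold.
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;  -- seconds; example value

-- EXPLAIN shows how MySQL will execute a single query, which helps confirm
-- that a rewritten join such as Listing 16 behaves as expected.
EXPLAIN SELECT books.* FROM authors, books
 WHERE books.author_id = authors.id AND authors.name = 'Jack Herrington';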

Concluding remarks

A database is a powerful tool, and like any powerful tool it can be abused if you don't know how to use it properly. The trick to identifying and solving these problems is to understand the underlying technology better. For years I've heard writers of business logic complain that they don't want to have to understand the database or the SQL. They treat the database as an object store and then wonder why performance is so poor.

They fail to realize how important understanding SQL is to turning the database from a necessary evil into a powerful ally. If you use databases every day but aren't comfortable with SQL, read The Art of SQL; it's well written, practical, and will give you a solid grounding in working with databases.


