General info - 6350958828e0b
Add a new id column :
ALTER TABLE spam_alter_email ADD `id` INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST ;
Exporting a large DB with terminal :
mysqldump --max_allowed_packet=1024M -u root -p database > dumpfile.sql
/home/rajeshwar/Videos/Cluster_query/2023-10-06/Cluster_New_update
Gravatar logo : https://ui-avatars.com/api/?name=John+Doe&background=random
Load data from a file into the database :
LOAD DATA INFILE '/home/export_file.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Download a large database :
mysqldump -h 127.0.0.1 -u root -p spam_alter > database-dump-alter.sql
Note for all of this :
1: We have to modify upload_max_filesize and memory_limit in the php.ini file (see the example after this note).
2: The php.ini file lives under the composer folder and then the PHP version folder.
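For example, values like the following in php.ini are usually enough for large imports (the numbers are only a suggestion; post_max_size typically needs raising as well):
upload_max_filesize = 1024M
post_max_size = 1024M
memory_limit = 2048M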
plocation key : 2c0bc0ca84f049c9ac43848eb0189e30
URL : $ip = '8.8.8.8'
Connect to MySQL on rajeshwar hosting :
mysql -h 127.0.0.1 -u db_user -p'db_pass'
Copying a file from local to remote :
scp ./dump_databse.tar rajeshwar@138.68.44.204:web/amma14397.ml/public_html
Cloning or Copying a Table:
CREATE TABLE new_table LIKE original_table;
Cloning a table with data :
CREATE TABLE new_table SELECT * FROM original_table;
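Note that CREATE TABLE ... SELECT copies the data but not the indexes. If both structure (with indexes) and data are needed, a two-step sketch:
CREATE TABLE new_table LIKE original_table;
INSERT INTO new_table SELECT * FROM original_table;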
Change a directory and all its file permissions at once (e.g. mode 755) :
sudo chmod -R 755 /folder_name
Remove diamond type special characters:
array_push($row,preg_replace('/[\x00-\x1F\x7F-\xFF]/', '',$v));
Uploading a large CSV file to a remote SQL server :
mysql -h<server|ip> -u<username> -p --local-infile bark -e "LOAD DATA LOCAL INFILE '<filename.csv>' INTO TABLE <table> FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"
Adding an index on a column :
mysql -h 138.68.44.204 -u rajeshwar_s_trap -p's_trap123' --local-infile rajeshwar_s_trap -e "ALTER TABLE spam_emails ADD INDEX spm_email (email(767))"
Truncate table
mysql -h 127.0.0.1 -u root -p'' --local-infile spam_traps -e "truncate table uploadData"
Load a large SQL dump file into a remote server
mysql -h yourhostname -u username -p databasename < yoursqlfile.sql
Dump a remote database to the local machine
mysqldump -h hostname-of-the-server -u mysql_user -p database_name > file.sql
https://lite.ip2location.com>>Raj14397@
allavsoft key :
1EC8-0D98-F0C4-36EB-AA53-F85C-93E6-40FC
A5DE-40A8-4747-BA2E-6E3D-40F8-FCCF-E863
BBD1-93C4-5564-9C19-2C13-AA6E-F6A9-C5DD
F32C-B0AB-1BC0-64A1-F2BE-2E83-6E87-C68C
7EAF-5E6C-AC9C-1A88-753B-E5C7-9F36-3B08
90_days_journey git project
git remote add origin git@github.com:NRKRaj14397/90_days_journey.git
git branch -M main
git push -u origin main
Select a random row with MySQL:
SELECT column FROM table
ORDER BY RAND()
LIMIT 1
Get MAC address :
inxi -Nn
Remove all double quotes:
UPDATE Table_Name
SET col_name = REPLACE(col_name, '"', '')
Import a large file into a local database :
mysql -h 127.0.0.1 -u root -p'' --local-infile Linkedin_DB_combined -e "LOAD DATA LOCAL INFILE 'Linkedin_DB_combined_34942771.csv' INTO TABLE records FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 ROWS"
Cron Job setup in vestacp :
If you want to permanently disable the error "Expression #N of SELECT list is not in GROUP BY clause and contains nonaggregated column 'db.table.COL' which is not functionally dependent on columns in GROUP BY clause; this is incompatible with sql_mode=only_full_group_by", do these steps:
sudo nano /etc/mysql/my.cnf
Add this to the end of the file
[mysqld]
sql_mode = "STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
Then restart MySQL:
sudo service mysql restart
This will disable ONLY_FULL_GROUP_BY for ALL users.
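To confirm the new mode is active after the restart, check it from the MySQL prompt:
SELECT @@GLOBAL.sql_mode;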
Zip a folder : zip -r archivename.zip directory_name
Remove index from table :
ALTER TABLE table_name
DROP INDEX index_name;
To make an index on a table :
1) Do not index the same table column multiple times.
2) Remove all previous indexes from the table column, then add the new one (see the sketch after this list).
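A sketch of step 2, assuming a hypothetical existing index named old_email_idx on spam_emails.email:
ALTER TABLE spam_emails DROP INDEX old_email_idx;
ALTER TABLE spam_emails ADD INDEX spm_email (email(767));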
How To Enable MySQL Query Cache
MySQL query cache stores the results of frequently and recently run queries so they can be returned quickly without processing everything from scratch. It is very useful in improving query speed and database performance. Here are the steps to enable the MySQL query cache for your database, set the MySQL cache size and disable the query cache. Please note, the MySQL query cache has been deprecated in MySQL 5.7 and removed in MySQL 8.0.
Here is MySQL Query cache configuration for your database.
1. Check Availability
Log into MySQL and run the following command to check if MySQL query cache is available for your database.
mysql> SHOW VARIABLES LIKE 'have_query_cache';
+------------------+-------+
| Variable_name    | Value |
+------------------+-------+
| have_query_cache | YES   |
+------------------+-------+
If you get the result as YES, it means your database supports query cache.
2. Default Query Cache Variables
MySQL query cache is controlled by many variables. Run the following command to view their default values first, before changing them.
mysql> show variables like 'query_cache_%' ;
You will see the following output
+------------------------------+----------+
| Variable_name                | Value    |
+------------------------------+----------+
| query_cache_limit            | 1048576  |
| query_cache_min_res_unit     | 4096     |
| query_cache_size             | 16777216 |
| query_cache_type             | OFF      |
| query_cache_wlock_invalidate | OFF      |
+------------------------------+----------+
Let us look at the above query cache variables,
- query_cache_limit – maximum size of query results that can be cached
- query_cache_min_res_unit – MySQL stores query results in blocks. This is the minimum size of each block.
- query_cache_size – indicates the total amount of memory allocated for MySQL cache.
- query_cache_type – setting this to 0 or OFF disables MySQL query cache. Setting it to 1 enables query cache.
- query_cache_wlock_invalidate – determines if MySQL should fetch results from cache if the underlying table is locked.
3. Enable MySQL Query Cache
Log out of MySQL. Open a terminal and run the following command to open the MySQL configuration file.
$ sudo vi /etc/mysql/my.cnf
Add the following lines below the [mysqld] section:
...
[mysqld]
query_cache_type=1
query_cache_size = 10M
query_cache_limit=256K
We have enabled the query cache by setting the query_cache_type variable to 1, with the per-query cache limit at 256 KB and the total cache at 10 MB. You can change the values of query_cache_size and query_cache_limit as per your requirements.
4. Restart MySQL database
Restart MySQL to apply changes
$ sudo systemctl restart mysql
OR
$ sudo service mysql restart
OR
$ sudo /etc/init.d/mysql restart
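After restarting, one way to confirm the cache is actually being used (on MySQL 5.7 or earlier, where the query cache still exists) is to look at the Qcache status counters; Qcache_hits should grow as repeated SELECTs are served from the cache:
mysql> SHOW STATUS LIKE 'Qcache%';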
Hopefully, this article will help you with MySQL query cache configuration.
Either remove the DEFINER=.. statement from your sqldump file, or replace the user values with CURRENT_USER.
The MySQL server provided by RDS does not allow a DEFINER syntax for another user (in my experience).
You can use a sed script to remove them from the file:
sed 's/\sDEFINER=`[^`]*`@`[^`]*`//g' -i oldfile.sql
Handling invalid UTF-8 when encoding JSON
With PHP 7.2, two options allow you to handle invalid UTF-8 directly in json_encode :
https://www.php.net/manual/en/function.json-encode
json_encode($text, JSON_INVALID_UTF8_IGNORE);
Or
json_encode($text, JSON_INVALID_UTF8_SUBSTITUTE);
Server-side processing - Ajax
<div class="container">
  <h2>jQuery Datatables Server Side Processing with PHP and MySQL</h2>
  <table id="developers" class="display" width="100%" cellspacing="0">
    <thead>
      <tr>
        <th>Id</th>
        <th>Name</th>
        <th>Age</th>
        <th>Gender</th>
        <th>Address</th>
        <th>Designation</th>
        <th>Skills</th>
      </tr>
    </thead>
  </table>
</div>
jQuery( document ).ready(function() {
var table = jQuery('#developers').dataTable({
"bProcessing": true,
"sAjaxSource": "load_data.php",
"bPaginate":true,
"sPaginationType":"full_numbers",
"iDisplayLength": 5,
"aoColumns": [
{ mData: 'id' } ,
{ mData: 'name' },
{ mData: 'age' },
{ mData: 'gender' },
{ mData: 'address' },
{ mData: 'designation' },
{ mData: 'skills' }
]
});
});
<?php
// Assumes an existing mysqli connection in $conn
$sql_query = "SELECT id, name, gender, skills, address, designation, age FROM developers LIMIT 20";
$resultset = mysqli_query($conn, $sql_query) or die("database error:" . mysqli_error($conn));
$developers_record = array();
while ($developer = mysqli_fetch_assoc($resultset)) {
    $developers_record[] = $developer;
}
$developer_data = array(
    "sEcho" => 1,
    "iTotalRecords" => count($developers_record),
    "iTotalDisplayRecords" => count($developers_record),
    "aaData" => $developers_record
);
echo json_encode($developer_data);
?>
Create View
CREATE view productlist AS SELECT brand.brand_name, product.product_name FROM brand INNER JOIN product ON product.brand_id = brand.brand_id
Update View
CREATE OR REPLACE view productlist AS SELECT brand.brand_id, brand.brand_name, product.product_name FROM brand INNER JOIN product ON product.brand_id = brand.brand_id
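Once created or replaced, the view can be queried like a regular table:
SELECT * FROM productlist;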
Search bar :
Get all the duplicate records :
SELECT
email, COUNT(email)
FROM
refresh_db WHERE project_uuid='625525dc37c2c'
GROUP BY
email
HAVING
COUNT(email) > 1
Delete duplicates
DELETE t1 FROM refresh_db t1 INNER JOIN refresh_db t2 WHERE t1.project_uuid='625525dc37c2c' AND
t1.id < t2.id AND
t1.email = t2.email
{
"message":"You cannot consume this service"
}
Enclose with single quotes (spreadsheet formula)
="'"&A2&"',"
write a summary for https://enterprise-guide.s3.amazonaws.com/resources/2.pdf with 1 header, 1 sub-header and main-content with 20 lines:
New line to <br> conversion:
// jQuery for example
$("div").css('white-space', 'pre-wrap')
.text('First Line\nSecond Line\n ---End');
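If actual <br> tags are needed instead of the CSS approach above, a minimal jQuery sketch (escape the text first, then swap newlines for <br>):
// Build an escaped HTML string, then replace newlines with <br>
var html = $('<div>').text('First Line\nSecond Line\n ---End').html().replace(/\n/g, '<br>');
$("div").html(html);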
Merge multiple CSV files into one file (dropping duplicate lines):
cat *csv | awk '!a[$0]++' > summary.csv
Removing an element from an array of objects by matching an object property:
arr = arr.filter(item => item.key !== "some value");
Download csv in js
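A minimal sketch using a Blob and a temporary link (csvString and the file name are placeholders):
// Assume csvString holds the CSV text to download
var csvString = 'id,email\n1,test@example.com\n';
var blob = new Blob([csvString], { type: 'text/csv;charset=utf-8;' });
var link = document.createElement('a');
link.href = URL.createObjectURL(blob);
link.download = 'export.csv'; // placeholder file name
document.body.appendChild(link);
link.click();
document.body.removeChild(link);
URL.revokeObjectURL(link.href);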