Welcome to the GoFuckYourself.com - Adult Webmaster Forum forums.
Discuss what's fucking going on, and which programs are best and worst. One-time "program" announcements from "established" webmasters are allowed.

 
Old 09-06-2017, 08:38 PM   #101
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
And this

Code:
root@ubuntu-2gb-blr1-14-04-3:~# sudo mkdir /var/mysqltmp
mkdir: cannot create directory ‘/var/mysqltmp’: No space left on device
How is it possible that sites which grew to 12 GB over 5 years are now filling 40 GB in a few days???

What is eating all this space?
porn-update is offline   Share thread on Digg Share thread on Twitter Share thread on Reddit Share thread on Facebook Reply With Quote
Old 09-06-2017, 10:16 PM   #102
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
Maybe now it works...

A lightning bolt crossed my mind while I was pooping...

And I remembered having enabled it a while back: standard HTTP caching
https://www.digitalocean.com/communi...n-ubuntu-14-04

And I remembered that the guide involved some lines like these:
CacheLockPath /tmp/mod_cache-lock
CacheEnable disk

I had added them to the first site's virtual host file just to try, and when I added the new domains (by copying that config file) they got copied to every new site...

I removed them from the .conf files and for the moment everything seems to be back to normal.

I restarted the server, but I probably have not yet freed all the space these caches used; I still have to figure out how to do that... and how to disable them permanently (probably sudo a2dismod cache_disk)...

I'll check over the next few days to see if everything keeps working...
Old 09-07-2017, 05:52 PM   #103
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
What does updatedb.mlocate do?

For a while now, looking at this graph on DigitalOcean, the second service sucking resources is always updatedb.mlocate.



I searched a bit on Google and found many guides on how to disable it, remove it, or delete it from the cron jobs.

But after all that searching I still have not figured out what it does and whether it is a necessary service...

Can someone tell me what it is and what it does?

And if it can be disabled?

Strange thing: I only see it on DigitalOcean; Nixstats does not show it.

Old 09-07-2017, 10:54 PM   #104
Barry-xlovecam
It's 42
 
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
locate is a system tool used like find.
updatedb.mlocate updates the database it searches.

Code:
barry@paragon-DS-7:/$ locate apache|grep error
/etc/apache2/conf-available/localized-error-pages.conf
/etc/apache2/conf-enabled/localized-error-pages.conf
/home/barry/server-host-configuration/domains/apache-sites/sites-available/apache-error-module
/usr/share/apache2/error
/usr/share/apache2/error/HTTP_BAD_GATEWAY.html.var
/usr/share/apache2/error/HTTP_BAD_REQUEST.html.var
/usr/share/apache2/error/HTTP_FORBIDDEN.html.var
/usr/share/apache2/error/HTTP_GONE.html.var
/usr/share/apache2/error/HTTP_INTERNAL_SERVER_ERROR.html.var
/usr/share/apache2/error/HTTP_LENGTH_REQUIRED.html.var
/usr/share/apache2/error/HTTP_METHOD_NOT_ALLOWED.html.var
/usr/share/apache2/error/HTTP_NOT_FOUND.html.var
/usr/share/apache2/error/HTTP_NOT_IMPLEMENTED.html.var
/usr/share/apache2/error/HTTP_PRECONDITION_FAILED.html.var
/usr/share/apache2/error/HTTP_REQUEST_ENTITY_TOO_LARGE.html.var
/usr/share/apache2/error/HTTP_REQUEST_TIME_OUT.html.var
/usr/share/apache2/error/HTTP_REQUEST_URI_TOO_LARGE.html.var
/usr/share/apache2/error/HTTP_SERVICE_UNAVAILABLE.html.var
/usr/share/apache2/error/HTTP_UNAUTHORIZED.html.var
/usr/share/apache2/error/HTTP_UNSUPPORTED_MEDIA_TYPE.html.var
/usr/share/apache2/error/HTTP_VARIANT_ALSO_VARIES.html.var
/usr/share/apache2/error/README
/usr/share/apache2/error/contact.html.var
/usr/share/apache2/error/include
/usr/share/apache2/error/include/bottom.html
/usr/share/apache2/error/include/spacer.html
/usr/share/apache2/error/include/top.html
/var/lib/apache2/conf/enabled_by_maint/localized-error-pages
/var/log/apache2/error.log
/var/log/apache2/error.log.1
/var/log/apache2/error.log.10.gz
/var/log/apache2/error.log.11.gz
/var/log/apache2/error.log.12.gz
/var/log/apache2/error.log.13.gz
/var/log/apache2/error.log.14.gz
/var/log/apache2/error.log.2.gz
/var/log/apache2/error.log.3.gz
/var/log/apache2/error.log.4.gz
/var/log/apache2/error.log.5.gz
/var/log/apache2/error.log.6.gz
/var/log/apache2/error.log.7.gz
/var/log/apache2/error.log.8.gz
/var/log/apache2/error.log.9.gz
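For comparison, the same kind of lookup without locate's prebuilt database is a live find scan, which is slower on a big filesystem; that slowness is exactly what updatedb.mlocate exists to avoid. A self-contained sketch on a throwaway tree (the file names here are invented for the demo):

```shell
# build a tiny throwaway tree, then do the "no database" lookup with find
tmp=$(mktemp -d)
mkdir -p "$tmp/etc/apache2"
touch "$tmp/etc/apache2/localized-error-pages.conf" "$tmp/etc/apache2/ports.conf"

# live equivalent of `locate apache | grep error`, scanning the disk right now
find "$tmp" -type f | grep error
# prints only the localized-error-pages.conf path
```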
Code:
barry@paragon-DS-7:/$ apt search mlocat
Sorting... Done
Full Text Search... Done

mlocate/xenial,now 0.26-1ubuntu2 amd64 [installed,automatic]
  quickly find files on the filesystem based on their name
Use apt search to get information on packages, daemons, or programs installed in Debian/Ubuntu Linux.

sudo apt autoremove mlocate
sudo apt purge mlocate
if you really want to remove mlocate.

Code:
barry@paragon-DS-7:/$ locate mlocat
/etc/cron.daily/mlocate
/usr/bin/mlocate
/usr/bin/updatedb.mlocate
/usr/share/doc/mlocate
/usr/share/doc/mlocate/AUTHORS
/usr/share/doc/mlocate/NEWS.gz
/usr/share/doc/mlocate/README
/usr/share/doc/mlocate/TODO.Debian
/usr/share/doc/mlocate/changelog.Debian.gz
/usr/share/doc/mlocate/copyright
/usr/share/locale-langpack/en_AU/LC_MESSAGES/mlocate.mo
/usr/share/locale-langpack/en_GB/LC_MESSAGES/mlocate.mo
/usr/share/man/man1/mlocate.1.gz
/usr/share/man/man5/mlocate.db.5.gz
/var/lib/mlocate
/var/lib/dpkg/info/mlocate.conffiles
/var/lib/dpkg/info/mlocate.list
/var/lib/dpkg/info/mlocate.md5sums
/var/lib/dpkg/info/mlocate.postinst
/var/lib/dpkg/info/mlocate.postrm
/var/lib/dpkg/info/mlocate.prerm
/var/lib/mlocate/mlocate.db
/var/lib/mlocate/mlocate.db.CBhsCO
barry@paragon-DS-7:/$
When the Mongols led by the Great Khan invaded, they destroyed the farming infrastructure without regard to its importance to the peasants' survival, because as nomads they did not understand what it was.

cat /usr/share/doc/mlocate/README
About
=====
mlocate is a locate/updatedb implementation. The 'm' stands for "merging":
updatedb reuses the existing database to avoid rereading most of the file
system, which makes updatedb faster and does not trash the system caches as
much.

The locate(1) utility is intended to be completely compatible to slocate. It
also attempts to be compatible to GNU locate, when it does not conflict with
slocate compatibility.

New releases will be available at https://fedorahosted.org/mlocate/ .

Installation
============
Before installation it is necessary to create a group called "mlocate" to allow
hiding the contents of the database from users.

When updatedb is run by root, the database contains names of files of all
users, but only members of the "mlocate" group may access it. "locate" is
installed set-GID "mlocate", no other program should need to run with this GID.

Portability
===========
mlocate should be portable to all SUSv3-compliant UNIXes, although it is
currently tested only on recent Linux distributions.

Bugs
====
Please consider reporting the bug to your distribution's bug tracking system.
Old 09-08-2017, 03:52 PM   #105
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
So maybe for now I'll keep it... we still have CPU to spare anyway...

The really serious problem, which I thought I had solved but clearly haven't, is the space...

I know that all my sites together (on the other cPanel server) weigh about 10-12 GB, and I have not yet transferred the two heaviest to this new server.

Today, trying to decompress a file of about 300 MB, I got an out-of-space message.

And I can't even figure out whether the space really is exhausted...

At login:

Code:
  System load:  0.31               Processes:           105
  Usage of /:   37.1% of 39.34GB   Users logged in:     0
  Memory usage: 8%                 IP address for eth0: 139.59.71.64
  Swap usage:   0%
Digitalocean monitor



Nixstats monitor




Code:
root@ubuntu-2gb-blr1-14-04-3:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            997M   12K  997M   1% /dev
tmpfs           201M  364K  200M   1% /run
/dev/vda1        40G   15G   24G  39% /
none            4.0K     0  4.0K   0% /sys/fs/cgroup
none            5.0M     0  5.0M   0% /run/lock
none           1001M     0 1001M   0% /run/shm
none            100M     0  100M   0% /run/user
Code:
root@ubuntu-2gb-blr1-14-04-3:~# du -max / | sort -rn | head -20
15108	/
13885	/var
10806	/var/cache
10655	/var/cache/apache2/mod_cache_disk
10655	/var/cache/apache2
1216	/var/log
988	/var/www/html
988	/var/www
921	/usr
882	/var/log/apache2
874	/var/lib
499	/var/lib/mysql
445	/var/log/apache2/access.log
372	/var/log/apache2/access.log.1
353	/usr/lib
296	/var/www/html/cdn.zip
263	/usr/share
242	/var/www/html/bigboobsupdate.com
218	/var/lib/mlocate/mlocate.db
218	/var/lib/mlocate
Code:
root@ubuntu-2gb-blr1-14-04-3:~# sudo du -sxm /var/* | sort -nr | head -n 15
10819	/var/cache
1209	/var/log
1021	/var/www
873	/var/lib
2	/var/backups
1	/var/tmp
1	/var/spool
1	/var/opt
1	/var/mail
1	/var/local
1	/var/crash
0	/var/run
0	/var/lock

root@ubuntu-2gb-blr1-14-04-3:~# sudo du -sxm /var/cache/* | sort -nr | head -n 15
10668	/var/cache/apache2
89	/var/cache/apt-xapian-index
56	/var/cache/apt
6	/var/cache/debconf
2	/var/cache/man
1	/var/cache/pppconfig
1	/var/cache/pollinate
1	/var/cache/ldconfig
1	/var/cache/dbconfig-common
1	/var/cache/apparmor
I just cannot figure out what is eating all this space, and I can't do anything until I solve it, not even transfer the last 2 sites.

I no longer have the other day's problem where the sites showed only errors; now they seem to stay online, but the server keeps running out of space...
Old 09-08-2017, 08:13 PM   #106
Barry-xlovecam
It's 42
 
Join Date: Jun 2010
Location: Global
Posts: 18,083
root@ds12-ams-2gb:/home# du -sh
or wherever your web root is

/var/www ?

du -h

will be more verbose

check the webroot: are you caching any content there?
Old 09-08-2017, 09:25 PM   #107
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
Code:
root@ubuntu-2gb-blr1-14-04-3:/var/www/html# du -sh
988M	.
root@ubuntu-2gb-blr1-14-04-3:/var/www/html# du -h
612K	./tranny-hardpics.com/images
24K	./tranny-hardpics.com/admin
93M	./tranny-hardpics.com
1.7M	./pornstarvideoupdates.com/images
28K	./pornstarvideoupdates.com/admin
110M	./pornstarvideoupdates.com
8.0K	./function_global/poptm
36K	./function_global/chaturbate_banner_rss
12K	./function_global/bongacash_banner
12K	./function_global/traffic_company_banner/mio iframe
32K	./function_global/traffic_company_banner
68K	./function_global/page_banner
12K	./function_global/hilltopads code.txt
8.0K	./function_global/pdo
12K	./function_global/popads_adblock
16K	./function_global/popads
8.0K	./function_global/mobile_detect/export
8.0K	./function_global/mobile_detect/namespaced/Detection
12K	./function_global/mobile_detect/namespaced
288K	./function_global/mobile_detect/tests/providers/vendors
292K	./function_global/mobile_detect/tests/providers
776K	./function_global/mobile_detect/tests
24K	./function_global/mobile_detect/examples
1.2M	./function_global/mobile_detect
160K	./function_global/juicyads_banner
12K	./function_global/clickadu
1.9M	./function_global
684K	./alternativegirlshardpics.com/images
20K	./alternativegirlshardpics.com/admin
28M	./alternativegirlshardpics.com
656K	./tranny-search.com/images
20K	./tranny-search.com/admin
80M	./tranny-search.com
4.0K	./cdn
676K	./veryhardpics.com/images
24K	./veryhardpics.com/admin
79M	./veryhardpics.com
636K	./tranny-beauty.com/images
24K	./tranny-beauty.com/admin
63M	./tranny-beauty.com
1.6M	./bigboobsupdate.com/images
28K	./bigboobsupdate.com/admin
4.0K	./bigboobsupdate.com/cgi-bin
242M	./bigboobsupdate.com
988M	.


As far as I know, the only cache systems currently installed are APCu, memcached, and OPcache.

CDN is the only folder with about 15000 photos, but it is the one I cannot extract because the disk runs out of space.
Old 09-10-2017, 02:00 AM   #108
Barry-xlovecam
It's 42
 
Join Date: Jun 2010
Location: Global
Posts: 18,083
15000 images × ~100 KB ≈ 1,500,000,000 bytes

so /CDN is maybe 1.5 GB

Why are there no users shown? /home/user
What is "finished space" supposed to mean?
root should be able to access all locations.
Old 09-11-2017, 07:26 AM   #109
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380


No, CDN is still empty. I managed to upload the zip, but when I try to extract it, it extracts some photos and then stops with "no space left".

Even fully extracted, CDN weighs only about 309 MB.



The strange thing is that some counters report the space as exhausted, others as half empty...

Maybe some counters don't see the data in some of the caches?

I keep thinking about that damned "standard HTTP caching", which saved to /var/cache/apache2/mod_cache_disk, and in fact there is still something in this folder...





I would try to empty or delete it, but can I do that with a plain "rm", or will it destroy the server?
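For what it's worth, deleting the contents of the cache directory (not the directory itself) with rm is generally considered safe, ideally with Apache stopped so nothing is writing to it. A rehearsal on a throwaway tree; on the real server the same rm would be pointed at /var/cache/apache2/mod_cache_disk (an assumption based on the HTCACHECLEAN_PATH above):

```shell
# rehearse on a throwaway tree first; the layout mimics mod_cache_disk's
cache=$(mktemp -d)/mod_cache_disk
mkdir -p "$cache/aa/bb"
touch "$cache/aa/bb/entry.header" "$cache/aa/bb/entry.data"

rm -rf "$cache"/*          # delete contents only; the directory itself survives
ls -A "$cache" | wc -l     # → 0
```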
Old 09-11-2017, 07:39 AM   #110
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
Umh... I did this:

Code:
root@ubuntu-2gb-blr1-14-04-3:/var/cache/apache2/mod_cache_disk# du -sh
11G	.
But I also found this:

File: /etc/default/apache2
Code:
### htcacheclean settings ###

## run htcacheclean: yes, no, auto
## auto means run if /etc/apache2/mods-enabled/cache_disk.load exists
## default: auto
HTCACHECLEAN_RUN=auto

## run mode: cron, daemon
## run in daemon mode or as daily cron job
## default: daemon
HTCACHECLEAN_MODE=daemon

## cache size
HTCACHECLEAN_SIZE=300M

## interval: if in daemon mode, clean cache every x minutes
HTCACHECLEAN_DAEMON_INTERVAL=120

## path to cache
## must be the same as in CacheRoot directive
HTCACHECLEAN_PATH=/var/cache/apache2/mod_cache_disk

## additional options:
## -n : be nice
## -t : remove empty directories
HTCACHECLEAN_OPTIONS="-n"
So it seems the cache should be capped and cleaned out regularly...

But something seems not to have worked properly...
Old 09-11-2017, 08:27 AM   #111
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
I launched this:
Code:
root@ubuntu-2gb-blr1-14-04-3:~# htcacheclean -p/var/cache/apache2/mod_cache_disk -l 1
Ubuntu Manpage: htcacheclean - Clean up the disk cache

It did do something:
Code:
  System load: 0.0                Memory usage: 3%   Processes:       70
  Usage of /:  32.0% of 39.34GB   Swap usage:   0%   Users logged in: 0
Although, having removed 11 GB, I expected more... (if I read the manpage right, -l 1 means a size limit of roughly 1 byte, i.e. wipe essentially the whole cache)
Old 09-14-2017, 08:47 AM   #112
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380


Much better than before, but the time still seems a little high...

Will it improve over time? Can I improve it in some way?
Old 09-14-2017, 01:27 PM   #113
Barry-xlovecam
It's 42
 
Join Date: Jun 2010
Location: Global
Posts: 18,083
If you are using Varnish you are caching pages -- and taking up space -- see if you can purge the little-used pages on a daily basis.

If you request images from other servers you may have slow page load times depending on the number of images requested, the geolocation and peering to your server(s) and the current load on the server you are requesting images from.

Fewer images per page might help. Using jQuery lazy load in your HTML might help also. The initial load time should start out better.
Old 09-15-2017, 08:55 AM   #114
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
I did install Varnish, but it practically never worked: on DigitalOcean, Varnish and Apache fight over port 80 because of some symbolic link. The issue was fixed on 16.04, but the 14.04 image was left a bit abandoned.
Currently it seems that either Apache starts or Varnish starts; together they refuse to work.

Unfortunately almost all the images on my sites live on the content producers' sites, over which I have no control.
A while ago I tried creating thumbnails and hosting them on my server (in the CDN folder), using Cloudflare for caching and CDN, but I lost about 80% of my visits...
It is still more or less active here: Big Boobs Hard Pics | Big Boobs, Huge Boobs, Huge Tits, Busty, but I have never recovered all the visits.

For some time I have had a lazyload installed (http://www.lezdomhardtube.com/lazysizes.min.js), but not the jQuery one, because the jQuery framework alone weighs practically more than the code of my sites.
I am not very convinced by this lazyload: it does not load all the photos on the page, but it loads more than the ones visible in the window... it seems to do its job, but a bit too eagerly... and I have not noticed significant changes between before and after implementing it...
As soon as I have free time I will try another one, just to see.

Thanks as always for the answers.
Old 09-15-2017, 10:26 AM   #115
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
I'm now trying to focus on headers to speed things up.

For example, I have mod_deflate enabled, but I cannot tell whether it is compressing, or what it is compressing...

I have all its nice rules in each site's .htaccess file, but I have no idea what, or where, the server-level configuration is.

Code:
<IfModule mod_deflate.c>
      <IfModule mod_setenvif.c>
            BrowserMatch ^Mozilla/4 gzip-only-text/html
            BrowserMatch ^Mozilla/4\.0[678] no-gzip
            BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
            BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
      </IfModule>
      <IfModule mod_headers.c>
            Header append Vary User-Agent env=!dont-vary
      </IfModule>
AddOutputFilterByType DEFLATE text/css application/x-javascript text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
</IfModule>
In cPanel there is a "compress all" option and no way to do anything else.
On this server, though, I saw there are settings for the compression level, but I cannot even find the right configuration files; in the sense that I find files, but inside them there is none of what the guides talk about...
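For reference, on Debian/Ubuntu the server-wide settings live in /etc/apache2/mods-available/deflate.conf, not in a monolithic httpd.conf (an assumption based on the standard Debian layout; the exact MIME types listed are just an example). A minimal sketch using real mod_deflate directives:

```apache
<IfModule mod_deflate.c>
    # compress these response types server-wide
    AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css
    AddOutputFilterByType DEFLATE application/javascript text/javascript
    # zlib compression level, 1 (fastest) to 9 (smallest)
    DeflateCompressionLevel 6
</IfModule>
```

Changes there apply to every vhost after an `apache2ctl graceful`, without touching any per-site .htaccess.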



Another thing: for example, I would like to add charset=UTF-8 to the Content-Type header.

I have seen guides saying it is in the httpd.conf file, which I cannot find, or in the Apache configuration files, and in mine there is nothing about it...

I'm not understanding anything...

This is the current header configuration of my sites; something is definitely missing, but I cannot figure out how to add or edit anything (except via .htaccess).



How and where can I configure server headers globally, without using individual .htaccess files?
Old 09-18-2017, 10:06 AM   #116
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
So... in a couple of days I figured some things out... mostly that there is practically nothing to figure out...

Deflate already seems to do everything on its own and works very well as it is...

And the less you touch it, the better...

I did find, however, that some tools like PageSpeed, GTmetrix, and Varvy say compression is not enabled, because of these 2 files:
search.js
lazysizes.min.js

I added "text/javascript" to /etc/apache2/mods-enabled/deflate.conf (an idea found while searching) and search.js seems resolved, but they keep telling me compression is not enabled for lazysizes.min.js, perhaps because of the ".min", which maybe stops the extension from being recognized.

I could change my sites and remove the .min from the filenames, but it would take a long time, and sooner or later the problem would surely come back.

Is there a way to fix the failure to compress .js and .min.js files permanently at the server level?


P.S.
For the UTF-8 charset I found that it is in this file: /etc/apache2/conf-available/charset.conf
Just enable this line: AddDefaultCharset UTF-8

P.P.S.
Considering that almost all the images on my sites come from external resources, could it be a good idea to enable deflate compression for images too?
Or would it completely kill my server's resources?
Does deflate even affect images from external resources?

P.P.P.S.
In PageSpeed I noticed for the first time the PageSpeed module for Apache (I had never noticed it before, not having had a server).
Could that be a good or a bad idea?
I usually do not trust BigG much, because it tends to take much more than it gives, and I do not want to hand it my server's resources for free. It doesn't really need them.
Old 09-18-2017, 10:46 AM   #117
Barry-xlovecam
It's 42
 
Join Date: Jun 2010
Location: Global
Posts: 18,083
It might be just easier to spend another $20/mo and expand the server's resources?

Quote:
Does deflate also affect images from external resources?
NO.
Those come from a remote server you cannot control; deflate only compresses what your own Apache serves.

use

$ sed 's/\.min//g'

test it first, then sed -i.bk
(-i = in-place edit, .bk = backup file suffix)

I like to make a backup directory with copies in case I fuck-up
Code:
$ mkdir backedup; cp * backedup
you will need the file paths
Code:
$ find . -name "*.min.js"
or
Code:
$ find . -name "*.js" -o  -name "*.min.js"
find . is recursive, so start in the right location, just above the files.

to find script references (also recursive):
Code:
$ grep -rni '.js'
JavaScript files are generally cached in browsers -- their requests usually come back 304 after the first visit -- certainly within a session. Exceptions being cleared user caches and private/incognito sessions.
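The find + sed idea above, rehearsed end to end on a throwaway tree (the js/ layout and file contents here are invented for the demo):

```shell
# throwaway tree with one minified file and one HTML page referencing it
tmp=$(mktemp -d); cd "$tmp"
mkdir js
printf 'console.log(1)\n' > js/lazysizes.min.js
printf '<script src="js/lazysizes.min.js"></script>\n' > index.html

# 1. rename every *.min.js to *.js
find . -name '*.min.js' | while read -r f; do mv "$f" "${f%.min.js}.js"; done

# 2. update the references, keeping a .bk backup of each edited file
grep -rl '\.min\.js' . | xargs sed -i.bk 's/\.min\.js/.js/g'

ls js                 # → lazysizes.js
grep src index.html   # → <script src="js/lazysizes.js"></script>
```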
Old 09-18-2017, 11:55 AM   #118
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
But I know which file it is: it is the famous lazyload (sorry if I did not write that before, I did not think of it).

What I cannot understand is why deflate is not compressing it.
Old 09-20-2017, 12:54 PM   #119
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
I was thinking of installing Fail2ban, but I saw that it reads Apache errors.

So, just to have a look, I opened the Apache error log and noticed it is full of this:
Code:
[Mon Sep 18 06:39:16.678185 2017] [core:error] [pid 31667] [client 180.76.15.6:29891] AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace.
[Mon Sep 18 06:39:18.149155 2017] [core:error] [pid 31837] [client 180.76.15.23:55177] AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace.
[Mon Sep 18 06:39:18.653379 2017] [core:error] [pid 31984] [client 180.76.15.141:41970] AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace.
The IP seems to be Baidu, which I block via htaccess in all my sites with:
Code:
RewriteCond %{HTTP_USER_AGENT} ^.*MJ12bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Yandex [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Baidu [NC]
RewriteRule .*  - [L,F]
(It is a rule that various hosting and VPS services made me add, to free up resources and keep the sites online. I don't even know whether it's a good thing or not...)
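As an aside, AH00124 usually means a rewrite rule keeps matching its own output on each internal pass. A common .htaccess guard (a generic sketch, not taken from this thread's actual config; the index.php front controller here is hypothetical) is to rewrite only the original request:

```apache
RewriteEngine On
# REDIRECT_STATUS is empty only on the original request,
# so internal re-entries skip the rule and cannot loop
RewriteCond %{ENV:REDIRECT_STATUS} ^$
RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
```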

I'm afraid that Fail2ban would be continually reading new errors and would probably drink up all my server's resources...

(I also don't love the fact that every visit generates a log line)

Searching on Google, it seems to be some URL-rewrite problem.

Searching for LogLevel debug I found this:
mod_rewrite - Apache HTTP Server Version 2.4

But I think I didn't understand something because:
Code:
root@ubuntu-2gb-blr1-14-04-3:~# tail -f error_log|fgrep '[rewrite:'
tail: cannot open ‘error_log’ for reading: No such file or directory
root@ubuntu-2gb-blr1-14-04-3:~# LogLevel alert rewrite:trace3
LogLevel: command not found

What am I missing? How should I use this?
Old 09-20-2017, 04:52 PM   #120
Barry-xlovecam
It's 42
 
Join Date: Jun 2010
Location: Global
Posts: 18,083
Quote:
Originally Posted by porn-update View Post
I was thinking of installing Fail2ban, but I saw that it reads Apache errors.....

But I think I didn't understand something because:
Code:
root@ubuntu-2gb-blr1-14-04-3:~# tail -f error_log|fgrep '[rewrite:'
tail: cannot open ‘error_log’ for reading: No such file or directory
root@ubuntu-2gb-blr1-14-04-3:~# LogLevel alert rewrite:trace3
LogLevel: command not found

What am I missing? How should I use this?
You need to find the location of the error log for that domain.

use find or locate or read the configuration file for that domain

Code:
$ cd /etc/apache2/sites-available
$ tac <domain file> |less
or from ~ of root


Code:
$ (cd /etc/apache2/sites-available &&  grep -i 'error\.log' <domain config file>)
then when you have the location (path)

Code:
$ tac <path/to/file/error.log> |less
This reads the file starting from the last line. less shows one screen at a time: space bar for the next page, return for the next line, q to quit and close the command. tac is cat backwards -- get it?

Code:
$ tac <path/to/file/error.log> egrep -i 'this|or|that' |less

$  man grep
#for more command options
Old 09-20-2017, 05:03 PM   #121
Barry-xlovecam
It's 42
 
Join Date: Jun 2010
Location: Global
Posts: 18,083
deny from 180.76.15.0/24

but I think that iptables or ufw firewalls are a better way to go than .htaccess (for example: sudo ufw deny from 180.76.15.0/24)

Baidu is velly sneeky ...


Code:
root@(none):~# dig ANY baidu.com

; <<>> DiG 9.7.3 <<>> ANY baidu.com
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 9342
;; flags: qr rd ra; QUERY: 1, ANSWER: 15, AUTHORITY: 0, ADDITIONAL: 2

;; QUESTION SECTION:
;baidu.com.			IN	ANY

;; ANSWER SECTION:
baidu.com.		7200	IN	SOA	dns.baidu.com. sa.baidu.com. 2012136870 300 300 2592000 7200
baidu.com.		7200	IN	TXT	"v=spf1 include:spf1.baidu.com include:spf2.baidu.com include:spf3.baidu.com a mx ptr -all"
baidu.com.		7200	IN	TXT	"google-site-verification=GHb98-6msqyx_qqjGl5eRatD3QTHyVB6-xQ3gJB5UwM"
baidu.com.		7200	IN	MX	20 mx50.baidu.com.
baidu.com.		7200	IN	MX	10 mx.n.shifen.com.
baidu.com.		7200	IN	MX	20 mx1.baidu.com.
baidu.com.		7200	IN	MX	20 jpmx.baidu.com.
baidu.com.		539	IN	A	123.125.114.144
baidu.com.		539	IN	A	220.181.57.217
baidu.com.		539	IN	A	111.13.101.208
baidu.com.		86400	IN	NS	ns4.baidu.com.
baidu.com.		86400	IN	NS	ns7.baidu.com.
baidu.com.		86400	IN	NS	dns.baidu.com.
baidu.com.		86400	IN	NS	ns3.baidu.com.
baidu.com.		86400	IN	NS	ns2.baidu.com.

;; ADDITIONAL SECTION:
mx1.baidu.com.		300	IN	A	61.135.163.61
jpmx.baidu.com.		7200	IN	A	61.208.132.13

;; Query time: 252 msec
;; SERVER: 75.127.97.7#53(75.127.97.7)
;; WHEN: Wed Sep 20 19:56:06 2017
;; MSG SIZE  rcvd: 509
Old 09-20-2017, 06:35 PM   #122
Barry-xlovecam
It's 42
 
Join Date: Jun 2010
Location: Global
Posts: 18,083
Quote:
Originally Posted by porn-update View Post
I was thinking of installing Fail2ban, but I saw that it reads Apache errors.

But I think I didn't understand something because:
Code:
root@ubuntu-2gb-blr1-14-04-3:~# tail -f error_log|fgrep '[rewrite:'
tail: cannot open ‘error_log’ for reading: No such file or directory
root@ubuntu-2gb-blr1-14-04-3:~# LogLevel alert rewrite:trace3
LogLevel: command not found

What am I missing? How should I use this?
Sorry, long day -- I meant that log's location, /path/to/THAT/error_log:
the full path to THAT log, not the domain config file.

Also pipe into the grep
$ tac <path/to/file/error.log> | egrep -i 'this|or|that' |less
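The corrected pipeline, rehearsed on a throwaway log so the pieces are visible (the log contents are invented; grep -E is the same as egrep):

```shell
# fake error log, newest entry NOT last, to show why tac helps
log=$(mktemp)
printf 'first line\n[rewrite:trace3] hit\nlast line\n' > "$log"

# read newest-first and keep only the interesting lines
tac "$log" | grep -Ei 'rewrite|error'
# → [rewrite:trace3] hit
```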
Old 09-21-2017, 08:50 AM   #123
porn-update
Confirmed User
 
Join Date: Apr 2014
Posts: 380
Thanks a lot for the answers

Unfortunately I am forced to put this on hold because of 247host, which has just suspended the account on my second server.

The one where I only keep test sites, or sites getting just a few visits (roughly 10-200), was suspended for excessive resource consumption...
There are about 40 sites, but they do practically nothing...

It is the second time in a week they have forced me to buy a more expensive service.
They do not warn you, give no notice, say nothing: they immediately suspend the account and send you an email with a link to buy a more expensive service...
And along with the account, all the mailboxes die too... including the work ones...

I would have changed short service, but I was hoping to be able to do everything quietly, and instead not...

They're really shit.




Anyway, leaving the anger aside...

I thought I'd try Linode, so as not to have everything on DigitalOcean: a small server but fully updated, with 16.04 or 17.04, PHP 7, etc...

Any particular advice, or differences between Linode and DigitalOcean?




P.S. Lucky me: I'll spend the next week moving sites again... all because of these crappy services...
Old 09-21-2017, 09:55 AM   #124
Barry-xlovecam
It's 42
 
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
I use both.
I have used Linode for 9 or 10 years.

I think they are comparable.
I am going to try Leaseweb when I have time -- their pricing for small VPS is very good -- have to see if there are any issues ...

I think you have PHP code issues if you are exceeding your resources. Is error_log the PHP error log's name?

You might get some of the script errors if you use the PHP CLI output at the terminal:

Code:
cd /path/to/script/
php <script name>.php
you may need to install php-cli first
Old 09-21-2017, 10:09 AM   #125
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
Linode does not accept my credit card...

So maybe DigitalOcean for the moment, if I can get mail working, since my mailboxes were in the suspended account...

Today is really a shitty day
Old 09-21-2017, 12:24 PM   #126
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
So, I moved the sites with the mailboxes over to Freehostia and set up the mail.

Now I receive all the mail in the world, except the message with the password for the new DigitalOcean droplet

What do you think about vultr?
Old 09-21-2017, 03:46 PM   #127
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
In the end I managed to install a new 16.04 LAMP droplet on DigitalOcean

The mail doesn't work yet, but I managed to access the new droplet via SSH key authentication and change the password
(with some difficulty, because as usual the DigitalOcean guides are always missing a piece)

I got pretty much the whole LAMP stack installed quickly enough (after all the times I tore things down and restarted on 14.04, I have learned by now)

I'll start uploading my sites in a bit.

This time, though, I really need the mail: 2 addresses on two different domains.

How should I proceed? Install a full mail server on my droplet?
Postfix? Dovecot? Roundcube?

Are there any other alternatives? I just need to be able to configure the accounts in Thunderbird and send and receive mail.
And maybe have a place to read them online when I'm not at my home PC...
And, if possible, recover the cPanel backup...
Old 09-22-2017, 08:40 AM   #128
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
Maybe Zoho?
Old 09-22-2017, 08:53 AM   #129
Barry-xlovecam
It's 42
 
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
You need access to your DNS records to change the MX entry.

You can add an SPF TXT record, and point the email (MX) at some other location.
Old 09-22-2017, 12:09 PM   #130
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
So, I configured my mail with Zoho (DNS, SPF, etc.). For the moment I'm not very relaxed: I noticed some strange behavior during configuration through DigitalOcean and received a few error messages, such as "non-existent domain" or "relay disabled". I tried sending mail to myself from the same address and to other addresses, and not all of the messages arrived...

I hope it's a DNS propagation issue; let's wait a couple of days and see whether everything starts to work...

Just in case, are there any other similar free services?

In the next few days I'll restore my sites from 247host; after 2 days they gave me a backup weighing 4.7 GB which is, however, missing about 40 sites... they really are crap...
Old 09-22-2017, 01:11 PM   #131
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
I have a problem: the MySQL server goes away...

Warning: PDOStatement::execute(): MySQL server has gone away in /var/www/html/xxxhashtag.com/sitemap_generator.php on line 88

I am reloading the sites from old backups, but I need to re-sync the sitemaps.

I have a script that reads the new rows in the database, adds the links to the sitemap, and marks each processed row with "insitemap = 1"; on the next run it reads only the rows with "insitemap = 0".

Normally everything works, but now I have to sync everything into the sitemap: about 600,000 rows...

The script runs for quite a while, then returns errors of this type.

I have increased the limits in mysql.cnf and php.ini, but it still fails. What else can I do?
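EDIT: for reference, the mysql.cnf limits involved are along these lines; the values below are just examples of the kind of thing I raised, not recommendations:

```ini
# /etc/mysql/my.cnf -- example values only
[mysqld]
max_allowed_packet = 64M   # large rows/queries no longer get cut off mid-transfer
wait_timeout       = 600   # seconds an idle connection stays open
net_read_timeout   = 120   # seconds the server waits for more data mid-query
```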
Old 09-25-2017, 01:38 PM   #132
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
So,
for the email I solved it with Yandex, and it works... Zoho kept doing strange things...

For MySQL I tried different configurations and several restarts, but without great results.

I doubled the server's resources and set up a local LAMP stack where I recreated the sitemaps, which I then uploaded to the server.
For the moment that is solved.

I still can't run the cronjob files manually (the browser just sits on a white page for hours). I created the cronjob with curl; let's see whether it manages to finish processing the file, or at least reports some error...

I have already found a problem with MySQL 5.7: it doesn't like my GROUP BY queries. For now I worked around it by removing "only_full_group_by" from sql_mode; I have too many queries to rewrite to do it now... I'll update the sites in the future.
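EDIT: for anyone with the same problem, the change amounts to setting sql_mode without ONLY_FULL_GROUP_BY. Assuming the MySQL 5.7 default list, it looks like this in my.cnf:

```ini
# /etc/mysql/my.cnf -- the 5.7 default sql_mode minus ONLY_FULL_GROUP_BY
[mysqld]
sql_mode = "STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
```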
Old 09-25-2017, 07:29 PM   #133
Barry-xlovecam
It's 42
 
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
My best thought is to run the script you are having all those issues with in the PHP CLI.

Code:
$ cd /path/to/script/
PHP Code:
<?php
/* uncomment to debug; comment out again for production */
error_reporting(E_ALL);
ini_set('display_errors', true);
/**********************************/
That turns full error reporting on. Then run:
Code:
$ php scriptname.php
maybe you will get some troubleshooting clues ...
Old 09-26-2017, 11:42 AM   #134
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
It shows me the PHP code of the file... just as the cronjob did on 14.04...

Weird, weird, weird...

Looking online, the first (and perhaps only) solution you find is: you forgot to enable "short_open_tag" in php.ini. But that's the first thing I activated...
Old 09-26-2017, 01:22 PM   #135
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
So,
To begin with I found this
Code:
2017-09-26T19:51:42.376416Z 678542 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed
2017-09-26T19:51:43.239533Z 678552 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed
2017-09-26T19:51:43.462009Z 678562 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed
2017-09-26T19:51:43.595315Z 678564 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed
2017-09-26T19:51:44.424527Z 678574 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed
It probably messed up quite a lot... I reloaded the table and now it seems to work.

Then I split my cronjob into pieces; more or less all the files seem to work, albeit slowly

But one particular file stays stuck for hours. It synchronizes data between these sites: adulthashtag.com, xxxhashtag.com, porn-tags.com; each day they exchange the new rows added to the database, and each one adds whatever it is missing.

The exchange file is a .txt, weighs about 50 KB and usually contains 1,500-2,000 new rows.

Each file sits on its own site and is loaded by the others with a function like this:
Code:
$file = "http://www.adulthashtag.com/new_query_search.txt";
foreach (file($file) as $search_query_row) {
Could some PHP or firewall restriction be preventing access to files outside the site and blocking everything for hours?
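EDIT: in the meantime, a way to test the fetch by hand is curl with explicit timeouts, so a blocked connection fails fast instead of hanging for hours (the URL is the one above; the output path is just an example):

```shell
# fetch the exchange file; give up quickly instead of hanging the cronjob
curl --silent --show-error \
     --connect-timeout 10 --max-time 30 \
     -o /tmp/new_query_search.txt \
     "http://www.adulthashtag.com/new_query_search.txt" \
  || echo "fetch failed or timed out"
```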

I have these entries in my syslog. UFW should be the firewall, but I can't tell whether
it's just doing its job or whether it's blocking me:
Code:
Sep 26 06:31:34 ubuntu-1gb-nyc3-01 kernel: [36766.523428] [UFW BLOCK] IN=eth0 OUT= MAC=36:1a:36:97:ff:ba:84:b5:9c:f9:08:30:08:00 SRC=210.146.241.198 DST=104.236.230.48 LEN=52 TOS=0x00 PREC=0x00 TTL=115 ID=17950 DF PROTO=TCP SPT=58001 DPT=8118 WINDOW=8192 RES=0x00 SYN URGP=0 
Sep 26 06:31:55 ubuntu-1gb-nyc3-01 kernel: [36786.721018] [UFW BLOCK] IN=eth0 OUT= MAC=36:1a:36:97:ff:ba:84:b5:9c:f9:18:30:08:00 SRC=138.201.19.161 DST=104.236.230.48 LEN=56 TOS=0x02 PREC=0x00 TTL=119 ID=14604 DF PROTO=TCP SPT=63002 DPT=8118 WINDOW=8192 RES=0x00 CWR ECE SYN URGP=0 
Sep 26 06:32:15 ubuntu-1gb-nyc3-01 kernel: [36807.234634] [UFW BLOCK] IN=eth0 OUT= MAC=36:1a:36:97:ff:ba:84:b5:9c:f9:18:30:08:00 SRC=46.161.9.49 DST=104.236.230.48 LEN=60 TOS=0x00 PREC=0x00 TTL=57 ID=13838 DF PROTO=TCP SPT=50016 DPT=8118 WINDOW=14600 RES=0x00 SYN URGP=0 
Sep 26 06:32:34 ubuntu-1gb-nyc3-01 kernel: [36826.106792] [UFW BLOCK] IN=eth0 OUT= MAC=36:1a:36:97:ff:ba:84:b5:9c:f9:18:30:08:00 SRC=213.136.75.227 DST=104.236.230.48 LEN=60 TOS=0x00 PREC=0x00 TTL=53 ID=12944 DF PROTO=TCP SPT=47055 DPT=8118 WINDOW=29200 RES=0x00 SYN URGP=0
The cronjobs should have been executed by now, but they have given no sign of life... I'll try installing Postfix as I did on 14.04... although this time I see no references to postfix in the logs; hopefully it helps
Old 09-27-2017, 10:19 AM   #136
Barry-xlovecam
It's 42
 
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
check those IPs:
$ whois <ip>

the 104.xxx is your server?

https://lists.torproject.org/piperma...ch/004159.html

https://lists.torproject.org/piperma...ch/004160.html

possibly ... the former IP user?
Old 09-27-2017, 12:03 PM   #137
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
Yes, and I do use Tor, but not often enough to fill the logs.

A thought... could it be Yandex with the emails?
Old 09-27-2017, 12:19 PM   #138
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
After the problem with the bigbreasthardpics table I took a look at all the databases

I'm finding a lot of MySQL errors from importing the 247host backup:
missing auto-increments, primary keys, default values, null values, etc.

Practically all the imported tables have problems...

The 247host backup really is crap...

Could these MySQL problems be the reason I find errors in the firewall, receive no news from the cronjobs, and can't run files, etc.?
Old 09-28-2017, 01:28 PM   #139
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
So, after correcting an infinity of MySQL problems caused by the import, things seem to be starting to work

The cronjobs too: the cross-site synchronization problem was also due to a missing auto-increment field.

Now the idea is to leave the server running freely for a few days and see what happens... especially whether this chart normalizes.



Right now it scares me...
but the other server also consumed many more resources in its first days than in normal use, and it didn't have all these problems with its MySQL tables.
Old 09-28-2017, 03:22 PM   #140
Barry-xlovecam
It's 42
 
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
These are not server problems they are software problems.
  1. I would back up the tables you have now
  2. then truncate the data in the tables
  3. correct any column type errors
  4. then repopulate the tables with new data using your PHP script
  5. if the cpu use is too high then -- you have some errors or a memory leak in your PHP script
Old 09-29-2017, 11:55 AM   #141
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
In the end I did exactly that: I installed a local LAMP stack and recreated every table I could via script.

The imported ones were not recoverable: many tables were missing the auto-increment and primary key fields, and the auto-increment columns were filled with empty values, missing numbers, zeros, etc. Really a mess...

After recreating the tables locally and importing my data, everything looks much more correct.

Today the first mail report from the xxxhashtag.com cronjob also arrived.
But maybe something still isn't working... the cronjob took 8,800 seconds... usually it takes 2...

Now I have to figure out whether something is still wrong or whether it just handled a heavy sync with the other sites.
Unfortunately the backups I had were not up to date, and xxxhashtag works with many sites and many databases.
Usually it checks the data of the last 24 hours, but since I restored the databases, a lot of data was probably "added" in the last 24 hours...

We will see in the next few days...
Old 09-29-2017, 02:40 PM   #142
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
Looking through the tables I found a lot of duplicate indexes that I did not create...

Code:
CREATE TABLE IF NOT EXISTS `xxxhashtag_search` (
  `id_search` int(11) NOT NULL AUTO_INCREMENT,
  `query` varchar(255) NOT NULL,
  `views` int(11) NOT NULL DEFAULT '1',
  `insitemap` int(1) NOT NULL DEFAULT '0',
  `insitemap_link` int(1) NOT NULL DEFAULT '0',
  `insitemap_link2` int(1) NOT NULL DEFAULT '0',
  `data_ins` varchar(255) NOT NULL DEFAULT '1388796621',
  `last_mod` varchar(255) DEFAULT '1415144202',
  `engine` varchar(255) NOT NULL,
  PRIMARY KEY (`id_search`),
  KEY `query` (`query`),
  KEY `query_2` (`query`),
  KEY `query_3` (`query`),
  KEY `query_4` (`query`),
  FULLTEXT KEY `query_5` (`query`),
  FULLTEXT KEY `query_6` (`query`)
) ENGINE=MyISAM  DEFAULT CHARSET=latin1 AUTO_INCREMENT=925373 ;
By removing the extra indexes, the script went from 8,000 seconds to 0.2 seconds...

Now I'm checking all the other tables for duplicate indexes...
(It will take me about 3 days; lucky me...)
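EDIT: for reference, the cleanup on the table above looks like this; I keep one plain KEY (`query`) and one FULLTEXT (`query_5`) and drop the duplicates (index names as in the dump above):

```sql
ALTER TABLE `xxxhashtag_search`
  DROP INDEX `query_2`,
  DROP INDEX `query_3`,
  DROP INDEX `query_4`,
  DROP INDEX `query_6`;
```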
Old 10-02-2017, 02:00 PM   #143
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
And now...

Everything seems to be starting to work... and questions start coming to mind, like:

Can I make some backups?
Only the databases (I can rebuild all the files from the databases), weekly, but the procedure shouldn't weigh too much on the server (some MySQL databases now hold 60-70 MB of data and the server is small).

It would also be nice if the dumps were sent somewhere else, like my PC, Yandex Disk, a big mailbox, or something like that... just to be sure that if the server catches fire they exist somewhere else... but again, all this shouldn't abuse the server's resources too much.


Then, what is the fastest way to move a site to another server? (Always unmanaged.)

I've had a bit of a desire to try Linode, or maybe Vultr (I asked, but I couldn't figure out whether Vultr accepts adult).

My sites will stay here for quite a while after all the effort I made; still, I wonder: to move a site, would I have to reload all the databases one by one via phpMyAdmin and all the files via FTP?
Is there a faster, more practical and safer way?
Old 10-03-2017, 07:19 AM   #144
Barry-xlovecam
It's 42
 
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
SSH

Warning, Will Robinson: connect to mysql as the root user so you can lock the tables

make a directory for your mysql backups
and cd into that directory

Code:
#! use root to lock tables
$ mysqldump --add-drop-table  -u root  -p [DATABASE NAME] >[DATABASE NAME].backup.$(date +%F).sql
Enter password:
/home/user/****.com/****/[DATABASE NAME].backup.2017-09-24.sql
is made^^
use scp or rsync to move the backup to other locations

You will have to

create the database user and grant permissions as needed

CREATE DATABASE [DATABASE NAME];

then read this
https://stackoverflow.com/questions/...-line-in-mysql
Old 10-03-2017, 02:57 PM   #145
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
So...
I've taken a look and tried to understand everything.

Restoring I already understand, because I had to use the command line to load those damn corrupted databases; via phpMyAdmin I couldn't import them.

For backup I have a few questions:
Can I create a single file to use in crontab?
E.g. an executable .sh... (I don't know exactly what they're called, but I've happened to make one on my PC...) with all the commands in order, to back up all the databases, and use it in a cronjob?

And launch this file once a week or once a month via crontab?
It depends on how many resources it consumes launching all the backups together.

I didn't know about SCP, but I like it... So much so that I have a little Raspberry Pi attached to my router which spends the day backing up my data between my PCs, my phone, my tablet and some cloud services.

So if I can pull the database backups to my Raspberry with SCP, it can save them anywhere...

Do the backup files have to be in /home/user/ for SCP to "take" them?
Even if SSH logs in as root?






------------------------------------------------------------------
Another little thing, off topic, but it came up a minute ago...

Doing
Code:
sudo apt-get update
sudo apt-get upgrade
On 14.04 the server says this:
Code:
Processing triggers for libapache2-mod-php5.6 (5.6.31-6+ubuntu14.04.1+deb.sury.org+1) ...
Processing triggers for php5.6-fpm (5.6.31-6+ubuntu14.04.1+deb.sury.org+1) ...
php5.6-fpm stop/waiting
php5.6-fpm start/running, process 26417
NOTICE: Not enabling PHP 5.6 FPM by default.
NOTICE: To enable PHP 5.6 FPM in Apache2 do:
NOTICE: a2enmod proxy_fcgi setenvif
NOTICE: a2enconf php5.6-fpm
NOTICE: You are seeing this message because you have apache2 package installed.
php5.6-fpm stop/waiting
php5.6-fpm start/running, process 26465
Should I do that?
I installed PHP 5.5 along with the LAMP stack, then PHP 5.6 later, but I don't remember ever asking for FPM...
Is it better to use FPM on 14.04, or is FPM obsolete now? If I enable it, do I have to redo the php.ini configuration, re-enable opcache, etc.?
Old 10-04-2017, 03:51 PM   #146
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
I found this script, almost perfect for what I want to do

Code:
#!/bin/bash
# Shell script to backup MySql database
# Back up MySql database files to the /backup dir, to be picked up later by your
# script. You can skip a few databases from the backup too.
# For more info please see (Installation info):
# http://www.cyberciti.biz/nixcraft/vivek/blogger/2005/01/mysql-backup-script.html
# Last updated: Aug - 2005
# --------------------------------------------------------------------
# This is a free shell script under GNU GPL version 2.0 or above
# Copyright (C) 2004, 2005 nixCraft project
# -------------------------------------------------------------------------
# This script is part of nixCraft shell script collection (NSSC)
# Visit http://bash.cyberciti.biz/ for more information.
# -------------------------------------------------------------------------
 
MyUSER=username     # USERNAME
MyPASS=password   # PASSWORD
MyHOST=hostname        # Hostname
 
# Linux bin paths, change this if it can't be autodetected via which command
MYSQL="$(which mysql)"
MYSQLDUMP="$(which mysqldump)"
CHOWN="$(which chown)"
CHMOD="$(which chmod)"
GZIP="$(which gzip)"
 
# Backup Dest directory, change this if you have someother location
DEST="/var/backup"
 
# Main directory where backup will be stored
MBD="$DEST/mysql"

#remove old backups
rm -f $MBD/*
 
# Get hostname
HOST="$(hostname)"
 
# Get data in dd-mm-yyyy format
NOW="$(date +"%d-%m-%Y")"
 
# File to store current backup file
FILE=""
# Store list of databases
DBS=""
 
# DO NOT BACKUP these databases
IGGY="information_schema cond_instances mysql performance_schema phpmyadmin"
 
[ ! -d $MBD ] && mkdir -p $MBD || :
 
# Only root can access it!
$CHOWN 0.0 -R $DEST
$CHMOD 0600 $DEST
 
# Get all database list first
DBS="$($MYSQL -u $MyUSER -h $MyHOST -p$MyPASS -Bse 'show databases')"
 
for db in $DBS
do
    skipdb=-1
    if [ "$IGGY" != "" ];
    then
        for i in $IGGY
        do
            [ "$db" == "$i" ] && skipdb=1 || :
        done
    fi
 
    if [ "$skipdb" == "-1" ] ; then
        #FILE="$MBD/$db.$HOST.$NOW.gz"
        #no gzip, compress the whole folder afterwards
        FILE="$MBD/$db.$HOST.$NOW.sql"

        # do it all in one job in a pipe,
        # connect to mysql using mysqldump for select mysql database
        # and pipe it out to gz file in backup dir :)
        #$MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db | $GZIP -9 > $FILE

        #no gzip, compress the whole folder afterwards
        $MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db > $FILE
    fi
done

#compress everything
zip -r $DEST/mysql_backup.$HOST.zip $MBD/

#tar -zcvf $DEST/mysql_backup.$HOST.tar.gz $MBD
I added this to delete last week's backups
Code:
#remove old backups
rm -f $MBD/*
I added the system databases among those excluded from the process
Code:
# DO NOT BACKUP these databases
IGGY="information_schema cond_instances mysql performance_schema phpmyadmin"
Removed the compression on mysqldump
Code:
#FILE="$MBD/$db.$HOST.$NOW.gz"
        #no gzip, compress the whole folder afterwards
        FILE="$MBD/$db.$HOST.$NOW.sql"

        # do all inone job in pipe,
        # connect to mysql using mysqldump for select mysql database
        # and pipe it out to gz file in backup dir :)
        #$MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db | $GZIP -9 > $FILE

        #no gzip, compress the whole folder afterwards
        $MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db > $FILE
And at the end I added compression of the entire /mysql folder
Code:
#compress everything
zip -r $DEST/mysql_backup.$HOST.zip $MBD/

#tar -zcvf $DEST/mysql_backup.$HOST.tar.gz $MBD
From all this comes a single zipped file which, in theory, I should be able to download to my Raspberry with SCP

Crontab? Just this line?
Code:
0 6 * * 4 /var/backup/mysql_backup
Will I receive some output by mail? Maybe the time the script took to run?

I hope I didn't make any big mistakes... I added, edited and deleted lines, but I have practically no idea what language the file is written in... it looks like PHP, and it seems to work... that's the best I can say... it would be nice if you could warn me if I made some horrendous error

One strange thing I noticed: the compressed file contains the folder structure /var/backup/mysql, while I was expecting only /mysql. Not a big problem, but strange...

Now I'll try to bring everything to my Raspberry via SCP; hopefully everything works.

The next and last step will be to figure out when to run all this so as not to cause problems for the server
Old 10-04-2017, 06:37 PM   #147
Barry-xlovecam
It's 42
 
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
rm -f is a bad idea if you do not need it

plain rm is fine
Make a copy manually like I told you,
then test your script manually; have a plan B

try adding at the bottom of your script
Code:
$ echo "`date` backup done"
this will print this out
Wed Oct 4 21:28:16 EDT 2017 backup done

add this line to your cron and check what happened in the morning
Code:
0 6 * * 4 (cd /var/backup/mysql_backup/; ./backup_script_name.sh) | mail -s "subject backup done" [email protected]
Old 10-05-2017, 01:15 PM   #148
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
Thank you
I tried everything, and it works!!!

The mail even arrives... without my telling it anything...
(probably thanks to the cronjob-Postfix link; now anything that has output gets mailed)

The output that arrives is the zip result, but that's fine: at least it tells me whether it did something and whether it worked.

The only strange thing: 16.04 complains a bit about this:
Code:
mysqldump: [Warning] Using a password on the command line interface can be insecure.
But I don't think I have many other alternatives

Rather than the date, can I get the time the script took?
Just to understand how long it occupied the server.
In PHP I usually put a time() at the beginning of the script and one at the end and calculate the difference, but here I don't know how...
Old 10-05-2017, 02:20 PM   #149
Barry-xlovecam
It's 42
 
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
Code:
barry@paragon-DS-7:~$ echo `date +%s`;

sleep 3; #script code here

echo `date +%s`;
outputs;

barry@paragon-DS-7:~$ echo `date +%s`; sleep 3; echo `date +%s`;
1507240274
1507240277

in seconds since the epoch (just subtract the first value from the second)


If you don't have a MAILTO=
at the top of your crontab, you have to set it (or the right email address) in the cron entry itself.
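For example, at the top of the crontab (the variable is spelled MAILTO; the address is a placeholder):

```
MAILTO=you@example.com

# weekly backup, Thursday 06:00; any stdout/stderr gets mailed to MAILTO
0 6 * * 4 /var/backup/mysql_backup.sh
```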

The password warning is for security. This is not going over the internet, so it is a root cron? Well, if you can't trust root locally on your own server, reformat fast!

Quote:
Data can be encrypted in the command channel, the data channel, or ideally, both. SCP: Secure Copy, or SCP, does not use FTP or SSL to transfer files, rather Secure Copy handles the file transfer and relies on the SSH protocol to provide authentication and security for both credentials and data.
Unless you are sending financial data or state secrets, I really would not worry about sending a password over SCP.
Old 10-06-2017, 12:55 PM   #150
porn-update
Confirmed User
 
porn-update's Avatar
 
Industry Role:
Join Date: Apr 2014
Posts: 380
OK, almost...

Excuse my stupidity, but this programming language looks strange to me.

So, is that it?

Code:
#At the beginning of the script, this:
STARTTIME = date +%s

#My script

#At the last line, this:
ENDTIME = date +%s
echo $ENDTIME - $STARTTIME;
Did I miss some ";", some "$", something else?

Sorry if this seems trivial and stupid, but from the script I can hardly work out how to do even the dumbest things.
Compared to my world of PHP, the "$" and ";" are missing, I don't understand why the variables are all written in uppercase, or how the lines end...
I think I understand that everything still runs in sequence, and that without "$" I define a variable while with "$" I read and use it, but about everything else I'm not sure...

Just to understand, what programming language is this?



Quote:
Unless, you are sending financial data or state secrets, I really would not worry sending a password SCP.
Here the only ones to have a secret are the transsexuals
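EDIT: for anyone finding this later, the corrected sketch; in bash there are no spaces around "=" in assignments, and arithmetic goes inside $(( )). The sleep stands in for the real script body:

```shell
#!/bin/bash
STARTTIME=$(date +%s)   # seconds since the epoch

sleep 2                 # ...script code here...

ENDTIME=$(date +%s)
echo "backup took $((ENDTIME - STARTTIME)) seconds"
```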